Building a startup. Also doing a PhD at Sorbonne Université and an MBA at Collège des Ingénieurs. Writing featured on TheNextWeb, HP Enterprise, and Built In.

OPINION

As Python’s lifetime grinds to a halt, a hot new competitor is emerging

If Julia is still a mystery to you, don’t worry. Photo by Julia Caesar on Unsplash

Don’t get me wrong. Python’s popularity is still backed by a rock-solid community of computer scientists, data scientists and AI specialists.

But if you’ve ever been at a dinner table with these people, you also know how much they rant about Python’s weaknesses: it’s slow, it demands excessive testing, and it still produces runtime errors despite all that testing. There’s enough to be pissed off about.
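To make that last complaint concrete, here is a minimal sketch (the mean function is hypothetical) of how Python’s dynamic typing can let a bug sail through a test and still blow up at runtime:

    def mean(values):
        """Return the arithmetic mean of a sequence of numbers."""
        return sum(values) / len(values)

    # A happy-path unit test with integers passes just fine...
    assert mean([1, 2, 3]) == 2

    # ...but the very same function crashes at runtime when a caller
    # feeds it strings, e.g. numbers freshly parsed from a CSV file:
    mean(["1", "2", "3"])  # TypeError: unsupported operand type(s) for +: 'int' and 'str'

A statically typed language would reject the second call before the program ever runs; in Python, only a test that happens to cover that exact input would catch it.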

Which is why more and more programmers are adopting other languages — the top players being Julia, Go, and Rust. …


Devs are devs, you say? Think again. There’s a fierce battle going on between web and game development over programming paradigms, best practices and fundamental approaches to work and life.

The clash is deeper than you’d think. Image by author

If you think that web development is an enormously lucrative field, you’re not wrong. In 2020, U.S. developers working on everything from simple blogging sites to complex social media platforms earned $40 billion. The video game industry is similarly flush with cash, currently enjoying a market value of $60 billion in the U.S. alone.

Despite their similar sizes, the industries couldn’t be more different. You can book competent web dev services from a fairly skilled high schooler. The skillset of a game developer, however, is way more advanced. Instead of creating a bunch of static sites…


Classical computing isn’t going away, but quantum technology has the potential to disrupt many industries. It’s crucial to leverage the strengths of both to unlock quantum’s full potential.

Quantum computers are already beating classical computers in many applications. Image by author

The promises of quantum computing are plentiful: It could help develop lifesaving drugs with unprecedented speed, build better investment portfolios for finance and usher in a new era of cryptography. Does that mean quantum computing will become the standard and classical computing will become obsolete?

The short answer is no. Classical computers have unique qualities that will be hard for quantum computers to attain. The ability to store data long-term, for example, is unique to classical computers: a quantum computer’s memory holds its state for a few hundred microseconds at most.

Additionally, quantum computers need to be kept at temperatures close…


Are we really driving innovation, or are we instilling false hope in a whole generation?

Are we really cultivating entrepreneurship, or is it just a show? Photo by Sebastian Herrmann on Unsplash

Medium’s largest publication is The Startup, with more than 700,000 followers: hundreds of thousands of people who are curious about startups, have created one themselves, or plan to build one in the future.

Hundreds of thousands of people who want to change the world in a myriad of ways. Hundreds of thousands of people with a vision and the motivation to work for it. Hundreds of thousands of people who want to reinvent things.

Entrepreneurs, so the story goes, are a rare species. According to that story, most people enjoy the comfort of their corporate jobs and their steady salaries…


Opinion

Companies lack leadership support, effective communication between teams, and accessible data

Most machine learning models never get deployed. Image by author

Corporations are going through rough times. And I’m not talking about the pandemic and the stock market volatility.

The times are uncertain, and the pressure to make customer experiences ever more seamless and immersive isn’t letting up. In that light, it’s understandable that companies are pouring billions of dollars into the development of machine learning models to improve their products.

But there’s a problem. Companies can’t just throw money at data scientists and machine learning engineers and hope that magic happens.

The data speaks for itself. As VentureBeat reports, around 90 percent of machine learning…


Forgotten to close a brace or add an indent? Many bugs have a simple fix

Python doesn’t need to be your enemy. Image by author

Fail fast, fail early: we’ve all heard the motto. Still, it’s frustrating when you’ve written a beautiful piece of code, only to realize that it doesn’t work as you’d expected.

That’s where unit tests come in. Checking each piece of your code helps you localize and fix your bugs.
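Here is a minimal sketch using Python’s built-in unittest module (the slugify function is made up for illustration):

    import unittest

    def slugify(title):
        """Turn a title into a URL slug, e.g. 'Hello World' -> 'hello-world'."""
        return "-".join(title.lower().split())

    class TestSlugify(unittest.TestCase):
        def test_basic_title(self):
            self.assertEqual(slugify("Hello World"), "hello-world")

        def test_messy_whitespace(self):
            # split() without arguments collapses runs of whitespace, so
            # leading, trailing and doubled spaces don't leak into the slug.
            self.assertEqual(slugify("  Hello   World  "), "hello-world")

    if __name__ == "__main__":
        unittest.main()

If the whitespace test fails, you know exactly which function to inspect instead of hunting through the whole project.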

But not all bugs are created equal. Some are unexpected, hard to spot, and tricky to fix even for experienced developers. These are more likely to occur in large and complex projects, and spotting them early can save you a ton of time later on.


And why you might consider switching if you’re dealing with front-end web or back-end Node development

Python and TypeScript are among the most-loved programming languages. Photo by Obi Onyeador on Unsplash

Python is my bread and butter, and I love it. Even though I’ve got some points of criticism against the language, I strongly recommend it for anybody starting out in data science. More experienced people in the field tend to be Python evangelists anyway.

However, this doesn’t mean that you can’t challenge the limits in your field from time to time, for example by exploring a different programming paradigm or a new language.

The list of Python’s competitors is long: Rust, Go, Scala, Haskell, Julia, Swift, C++, Java, and R all find an entry on it. …


When it comes to code, many scientists lack rigor. That carelessness could have disastrous effects on our ability to achieve reliable breakthroughs in a range of fields.

Software in science can be frustratingly messy. Photo by Wes Hicks on Unsplash

Science is messy. That’s why there’s some truth to the cliché of the scatterbrained professor. Not all scientists are like that, of course, but new theories and ideas are often birthed from downright chaos.

At the same time, many scientists are very clean and diligent when it comes to everyday tasks like keeping their desks tidy or their email inboxes uncluttered. This may sound paradoxical, but it really isn’t. All that mental mess needs to be managed somehow, and orderliness is a good strategy for keeping it contained within clear boundaries.

They set those boundaries by insisting on tidiness for…


The boffin fallacy doesn’t hold true anymore.

Photo credit: Christina @ wocintechchat.com

Data nerds, computer geeks, science morons, I’m speaking to you. It’s the ever-prevailing cliché: the antisocial introverts who spend their days hacking away at some nerdy project that nobody understands. The freaks that push the frontiers of tech every day but still can’t keep up with the Kardashians.

The cliché goes further. If techies lack basic human skills like communicating effectively or cracking a joke, then they won’t make good managers. And don’t even think of appointing such people as CEO.

Of course, this is a stereotype. Most techies I know — including myself — are interesting, multi-faceted…


Opinion

Why you won’t lose your job though

Is software development a bullshit job? I don’t think so. Photo by Annie Spratt on Unsplash

In 1930, John Maynard Keynes predicted that we’d be working 15-hour weeks by the end of the century. By 2013, it was clear that the great economist had gotten something wrong.

Welcome to the era of bullshit jobs, a term coined by anthropologist David Graeber. Since the 1930s, whole new industries have sprung up that don’t necessarily add much value to our lives. Graeber would probably call most jobs in software development bullshit.

I don’t share Graeber’s opinion, especially when it comes to software. But he does touch on an interesting point: as more and more processes…
