
Building a startup. Also doing a PhD at Sorbonne Université and an MBA at Collège des Ingénieurs. Writing featured on TheNextWeb, HP Enterprise, and Built In.


As Python’s momentum grinds to a halt, a hot new competitor is emerging

If Julia is still a mystery to you, don’t worry. Photo by Julia Caesar on Unsplash

Don’t get me wrong. Python’s popularity is still backed by a rock-solid community of computer scientists, data scientists and AI specialists.

But if you’ve ever been at a dinner table with these people, you also know how much they rant about the weaknesses of Python. From being slow to requiring excessive testing, to producing runtime errors despite prior testing — there’s enough to be pissed off about.

Which is why more and more programmers are adopting other languages — the top players being Julia, Go, and Rust. …
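The “runtime errors despite prior testing” complaint usually comes down to dynamic typing: a function can pass its tests and still blow up on an input type the tests never exercised. A minimal, hypothetical sketch:

```python
def average(values):
    """Return the arithmetic mean of a list of numbers."""
    return sum(values) / len(values)

# Passes a quick test with numeric input:
print(average([1, 2, 3]))  # 2.0

# But nothing stops a caller from passing strings; the type error
# only surfaces at runtime, on a path the tests didn't cover:
try:
    average(["1", "2", "3"])
except TypeError as err:
    print(f"runtime failure: {err}")
```

A statically typed language would reject the second call at compile time, which is one reason compiled alternatives like Julia, Go, and Rust keep coming up in these rants.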

And why that’s a good thing, if it’s done right

Fake data will become more and more common — and useful. Image by author

Life on earth is getting more and more fake. Real images are being photoshopped. Real human interactions were replaced by faces on screens during the pandemic — faces of people who might not have been wearing pants during the call. Now, it seems, even our precious data is being replaced by fakes.

Philosophers argue back and forth about why this is morally wrong and why it will destroy human society. Historians remind us of the 2016 U.S. election and explain how fake news is the source of all evil. …

Journals are retracting more and more papers because they’re not by the authors they claim to be

The amount of low-quality (and no-quality) papers in science is increasing. Image by author

The practice of science involves trying to find things out about the world by using rigorous logic and testing every assumption. Researchers then write up any important findings in papers and submit them for possible publication. After a peer-review process, in which other scientists check that the research is sound, journals publish papers for public consumption.

You might therefore reasonably believe that published papers are quite reliable and meet high-quality standards. You might expect small mistakes that got overlooked during peer review, but no major blunders. It’s science, after all!

You’d be wrong in expecting this, though. Real and good…

Scientists are on a quest to find dark matter. Without quantum tech, the search might take 10,000 years — literally.

In some areas, quantum computing is already outpacing classical computers. Image by author

Almost a century ago, Dutch astronomer Jacobus Kapteyn first proposed the existence of dark matter. He’d been studying the motion of stars in galaxies — a galaxy can be described, in rough terms, as a heap of stars, gas and dust rotating around a common center — and noticed that something was off. The stars in the outer layers of the galaxy were rotating much too fast to conform with the laws of gravity. Kapteyn’s hypothesis was that some invisible, massive stuff might be in and around the galaxy, making the outer stars reach the observed velocities.
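The rotation-curve argument can be made concrete with a standard Newtonian sketch (textbook reasoning, not Kapteyn’s original derivation). A star on a circular orbit of radius \(r\) around enclosed mass \(M(r)\) balances gravity against centripetal acceleration:

\[
\frac{G\,M(r)}{r^2} = \frac{v(r)^2}{r}
\quad\Longrightarrow\quad
v(r) = \sqrt{\frac{G\,M(r)}{r}}
\]

If essentially all the mass sat in the luminous center, \(M(r)\) would be roughly constant in the outskirts and \(v(r)\) would fall off as \(1/\sqrt{r}\). Observed velocities stay roughly flat instead, which requires \(M(r) \propto r\): mass that keeps growing with radius even where little light is seen.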

From the 1960s…


Replacing workers with AI is, paradoxically, creating demand for more workers

AI is taking a lot of jobs. But it’s also creating new ones. Image by author

Since the beginning of large-scale industrialization, automation has led to massive, widespread job losses. Whole cities like Detroit fell victim to this when the car industry replaced large numbers of humans with robots. Today, many of the hardest-hit places are barely a shadow of their bustling and blossoming past selves.

Adding robots to the economy displaces workers. A study from the University of Chicago found that adding just one robot per 1,000 workers to the economy causes the employment rate to decrease by at least 0.18 percentage points. This may not sound like much, but in a country the size of…

We might need to let go of the limitations of human thinking

AI and human brains might need to part ways. Image by author

In the summer of 1956, 10 scientists met at Dartmouth College and invented artificial intelligence. Researchers from fields like mathematics, engineering, psychology, economics and political science got together to find out whether they could describe learning and human thinking so precisely that it could be replicated with a machine. Hardly a decade later, these same scientists contributed to dramatic breakthroughs in robotics, natural language processing and computer vision.

Although a lot of time has passed since then, robotics, natural language processing and computer vision remain some of the hottest research areas to this day. …

With the help of some incredibly slow computers, it became a success anyway

It took a schoolteacher and lots of failures to make C what it is today. Image by author

If you thought that C is the kind of language that only 60-year-old white men know, think again. Yeah, it’s the dinosaur among today’s programming languages. But it’s still alive and kicking in more areas than you’d think.

For one, Unix is written in C. Originally written in assembly, the Unix kernel was rewritten in C back in 1973. This made Unix a lot more portable across different machines, and helped make it popular. …

Tips and Tricks

Analyze, test, and re-use your code with little more than an @ symbol

Nothing in software is magic. But decorators come quite close! Image by author

If there’s one thing that makes Python incredibly successful, it’s readability. Everything else hinges on that: if code is unreadable, it’s hard to maintain. It’s also not beginner-friendly — a novice boggled by unreadable code won’t attempt to write their own one day.

Python was already readable and beginner-friendly before decorators came around. But as the language started getting used for more and more things, Python developers felt the need for more and more features, without cluttering the landscape and making code unreadable.

Decorators are a prime example of a perfectly implemented feature. It does take…
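A decorator wraps a function to add behavior without touching its body — exactly the “little more than an @ symbol” the title promises. A minimal sketch (the timing decorator below is an illustrative example of my own, not code from the article):

```python
import functools
import time

def timer(func):
    """Decorator that reports how long the wrapped function takes."""
    @functools.wraps(func)  # preserve the wrapped function's name and docstring
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f"{func.__name__} took {elapsed:.6f}s")
        return result
    return wrapper

@timer
def square_sum(n):
    """Sum of squares 0² + 1² + … + (n-1)²."""
    return sum(i * i for i in range(n))

total = square_sum(1_000)  # prints a timing line, then returns the sum
```

The caller’s code doesn’t change at all; analysis (here, timing) is layered on purely by the `@timer` line, which is what makes decorators so readable.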

Devs are devs, you say? Think again. There’s a fierce battle going on between web and game development over programming paradigms, best practices and fundamental approaches to work and life.

The clash is deeper than you’d think. Image by author

If you think that web development is an enormously lucrative field, you’re not wrong. In 2020, U.S. developers working on projects ranging from simple blogging sites to complex social media platforms earned $40 billion. The video game industry is similarly flush with cash, currently enjoying a market value of $60 billion in the U.S. alone.

Despite their similar sizes, the industries couldn’t be more different. You can book competent web dev services from a fairly skilled high schooler. The skillset of a game developer, however, is way more advanced. Instead of creating a bunch of static sites…

Classical computing isn’t going away, but quantum technology has the potential to disrupt many industries. It’s crucial to leverage the strengths of both to unlock quantum’s full potential.

Quantum computers are already beating classical computers in many applications. Image by author

The promises of quantum computing are plentiful: It could help develop lifesaving drugs with unprecedented speed, build better investment portfolios for finance and usher in a new era of cryptography. Does that mean quantum computing will become the standard and classical computing will become obsolete?

The short answer is no. Classical computers have unique qualities that will be hard for quantum computers to attain. The ability to store data, for example, is unique to classical computers since the memory of quantum computers only lasts a few hundred microseconds at most.

Additionally, quantum computers need to be kept at temperatures close…

Rhea Moutafis
