Throughout history, we have witnessed periods of significant technological advancement that have revolutionised society. The Neolithic Revolution saw a shift from the nomadic hunter-gatherer lifestyle to settled agricultural societies. The first Industrial Revolution brought about machine-based production, and the second saw the rise of mass production, improved communication and transportation, and an increased standard of living.
We are now in the age of information, biotechnology, and the renewable energy revolution. We are witnessing, and will continue to witness, massive leaps forward in technology, bringing with them significant social and cultural changes. This is all happening at a speed never seen before.
Historically, the development of materials, chemicals, pharmaceuticals, and inventions was a slow and painstaking process that took many years. Advances were often incremental and relied on time-consuming trial-and-error methods. A 2011 UK Government paper, the Strategy for UK Life Sciences, reported that, “It now takes an average of $1 billion and 20 years to develop a new drug…”
Over the years, several factors have contributed to the accelerating pace of development, including collaboration and information sharing. Open communication within the scientific community allows researchers to build on each other’s work and avoid redundant effort. Serendipity can also play a role, with breakthroughs occurring while researchers are investigating something else. Above all, though, advances in tools and techniques, most notably AI, have driven the rapid progress we are now beginning to see.
Artificial intelligence is creating materials and technologies and developing drugs at a rate we’ve never seen before. Wynne-Jones IP Director and Patent Attorney Dr Elliott Davies said, “We’re now seeing breakthroughs in technology at an unprecedented rate. We’re no longer inching forward at the pace of a tortoise; we’ve become the hare.”
Earlier this month, the BBC reported that, using AI technology, Microsoft and the Pacific Northwest National Laboratory (PNNL) had created a new substance that could potentially reduce the use of lithium in batteries by up to 70%. Jason Zander, Executive Vice President of Microsoft, told the BBC that using AI technology, they believe they will be able to “compress 250 years of scientific discovery into the next 25.”
This new material has the potential to be a sustainable and more environmentally friendly alternative to lithium. With global warming becoming an increasing concern, developments like this mean that AI could provide the planet-saving answers we desperately need.
Artificial intelligence isn’t limited to saving the world, either. It could also develop the hardware it needs to advance itself. Many of the current limitations on computing technology come from the difficulty of storing, transmitting, and analysing vast amounts of data. The brain of AI is the computer chip, and for AI’s grey matter to become more advanced, the chip must be able to keep pace with it.
Enter the graphene semiconductor. This new development in semiconductor technology offers ten times the electron mobility of silicon and could power future quantum computers. While AI does not appear to have been involved in this particular breakthrough, the supercomputer this semiconductor could power might prove to be the machine with the ability to advance itself. What could that mean for the future? We’ll leave you to think about that one.