Much of the technological progress of the 20th century is a consequence of the understanding of the quantum mechanical world developed over that same century. The behavior of semiconductors, lasers, superconductors, nuclear reactors, etc. cannot be understood without a quantum mechanical description, and the period can appropriately be described as the first quantum revolution. However, for practical purposes, classical or semi-classical models have been sufficient to engineer ever more sophisticated applications of these technologies. The underlying quantum mechanical principles have played a relatively minor role in the engineering process and the final end products.
The semiconductor industry has for more than half a century made exponential progress described by Moore's law, and the International Technology Roadmap for Semiconductors targets 5 nm technology in 2021. With a lattice constant of 0.54 nm for silicon, this means that industrial-scale manufacturing is reaching length scales that are only about ten atoms wide. At the same time, increased computational power and the development of more efficient algorithms now allow researchers to simulate the behavior of thousands of atoms starting from quantum mechanical principles, while experimentally ever more complex structures can be built up atom by atom. An impressive demonstration of the latter is IBM's movie "A Boy And His Atom: The World's Smallest Movie".
The length scales accessible to industrial applications and the length scales accessible to fundamental quantum mechanical research are therefore rapidly converging. This is the second quantum revolution, in which new technologies emerge that fundamentally require a quantum mechanical description in the engineering process and manifest quantum mechanical behavior in the end products themselves.
The second quantum revolution is already well under way. This is not least clear from the many companies that are leaping into the field of quantum computation. IBM has launched a five-qubit quantum computer that can be accessed online and has recently announced a 50-qubit quantum computer. Microsoft has, through Station Q, partnered with universities around the world to develop new quantum computing devices, and recently released a software development kit and the programming language Q# for programming and simulating quantum computers. Recently the European Union also announced a €1 billion project called the Quantum Technology Flagship to kickstart the quantum technology industry in Europe.
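To give a concrete feel for what the qubit devices mentioned above compute, here is a minimal sketch of a two-qubit quantum computation simulated with plain numpy. It is deliberately not tied to any of the SDKs mentioned above; it just applies a Hadamard gate followed by a CNOT to prepare an entangled Bell state.

```python
import numpy as np

# Single-qubit basis state |0>.
zero = np.array([1, 0], dtype=complex)

# Hadamard gate: puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the second qubit when the first qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to the first qubit, then CNOT.
state = np.kron(zero, zero)
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# The result is the entangled Bell state (|00> + |11>)/sqrt(2):
# measuring gives |00> or |11>, each with probability 0.5.
probabilities = np.abs(state) ** 2
print(probabilities)
```

The point of the example is that the state of the two qubits can no longer be described qubit by qubit; the measurement outcomes of the two qubits are perfectly correlated, which is exactly the kind of behavior that has no classical counterpart.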
The second quantum revolution means that quantum mechanical calculations and experiments will move from being performed in small and relatively independent research groups to an industrial scale that places strong demands on the scalability of the methods used. This will create new job opportunities for physicists and require effective communication of quantum mechanical concepts to a broad engineering community not necessarily familiar with the scientific literature. It also places strong demands on the software development practices used to perform quantum mechanical calculations, especially the ability to seamlessly integrate different software components written by experts in different fields.
second-tech.com aims to address the first problem by providing resources and a meeting place for quantum physicists, engineers, etc. to collectively build a knowledge base centered around quantum technology. The blog covers recent advances in the field, the Q&A section allows the community to exchange ideas, and the job section provides a space for employers and employees to meet.
The sister site second-quantization.com provides documentation for the software development kit TBTK, which aims to address the software scalability issue by providing reusable data structures for second-quantized models.
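As a rough illustration of what a second-quantized model looks like in code, the sketch below builds the hopping matrix of a one-dimensional tight-binding Hamiltonian, H = -t Σᵢ (c†ᵢ₊₁cᵢ + h.c.), and diagonalizes it. Note that this is a generic numpy sketch, not TBTK's actual interface (TBTK itself is a C++ library).

```python
import numpy as np

N = 10     # number of lattice sites (arbitrary choice for illustration)
t = 1.0    # hopping amplitude

# Hopping matrix in the single-particle basis: each entry is the
# coefficient of a term c†_i c_j in the second-quantized Hamiltonian.
H = np.zeros((N, N))
for i in range(N - 1):
    H[i, i + 1] = -t   # c†_i c_{i+1}
    H[i + 1, i] = -t   # Hermitian conjugate, c†_{i+1} c_i

# Diagonalize to obtain the single-particle energy spectrum. For an open
# chain the exact eigenvalues are E_n = -2t*cos(n*pi/(N+1)), n = 1..N.
energies = np.linalg.eigvalsh(H)
expected = -2 * t * np.cos(np.arange(1, N + 1) * np.pi / (N + 1))
print(np.allclose(np.sort(energies), np.sort(expected)))
```

Libraries like TBTK exist precisely so that bookkeeping of this kind, specifying which operators couple and with what amplitude, scales beyond toy examples like this one.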
Let me know your thoughts in the comments below.