What is so difficult about building a quantum computer anyway?

When talking about quantum computing, we typically focus on “what is it?”, “why do we want it?” and “when do we get it?”... maybe even “how much will it cost?”. An equally valid question to ask is “why don’t we have one already?” Although it sounds like Veruca Salt has suddenly developed an interest in quantum technology, there are some interesting physics and engineering issues behind why building a quantum computer is fundamentally more difficult than any computer or electronic engineering that has gone before. After all, the phone in my pocket has literally billions of transistors, all switching on and off, synchronised to a clock which ticks more than a billion times per second. That is enough ticks and tocks, and bits and blocks to make even Dr. Seuss proud - and yet we practically take it for granted.

Given this phenomenal ability to design and manufacture such extraordinary devices, how hard could it be to wire up a few thousand qubits in order to discover the next wonder-drug, or read the odd foreign dictator’s email?

Well, there are several important differences between conventional (classical) computers and quantum computers. To understand these differences, let’s first discuss the fundamental components of a quantum computer. During the late 90s, David DiVincenzo proposed a list of “criteria” that any proposed quantum computer would have to fulfil to be useful. These criteria can be summarised as follows:

1. You must be able to build qubits, and build them in a way that allows you to “scale up” to thousands or millions of qubits for a full quantum computer.
2. You must be able to initialise these qubits in some known state.
3. These qubits must have long decoherence times.
4. You must be able to apply operations or gates to these qubits which are “universal”.
5. You must be able to measure (at least some of) the qubits.

(Note for the technical specialist: I know there are sometimes two additional criteria added regarding “flying qubits” but given modern error correction codes, I consider these to now be largely unnecessary or subsumed into the other five).

The first and probably biggest problem is that we don’t even know what to make our quantum computer out of. Unlike a classical bit, which can take one of two states (0 or 1), a qubit is described by two numbers: its population and its phase. In particular, the population can take on any value from 0 to 1. Although these requirements sound exotic, there are many examples of such systems in nature and we can now manufacture many more. As a consequence, there are many, many proposals in the literature on how to build a quantum computer. Recently the scientific community has narrowed this down to a few leading candidates, including ions held in electromagnetic traps, superconducting circuits, semiconductor devices (defects, donors or quantum dots) and photonic schemes using modes of light. All these approaches (and more) can in principle satisfy the DiVincenzo criteria, but the devil is in the detail and this is what currently limits progress for each of these approaches.
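To make those two numbers concrete, here is a minimal sketch in Python (my own illustration, using a standard parameterisation rather than anything specific to a particular hardware platform): the population sets the probability of reading a 1, and the phase multiplies the second amplitude.

```python
import numpy as np

# Minimal sketch: a qubit state described by a population p (0 to 1)
# and a phase phi. The population is the probability of measuring 1.
def qubit_state(p, phi):
    return np.array([np.sqrt(1 - p),                  # amplitude of |0>
                     np.exp(1j * phi) * np.sqrt(p)])  # amplitude of |1>

psi = qubit_state(0.3, np.pi / 4)
print(np.abs(psi) ** 2)  # measurement probabilities: [0.7, 0.3]
```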

Qubit scalability.
Most of us have baked a cake at some point in our lives. If you are good at it, it might take 15 minutes to get set up, mix everything, put it in the pan and take it out at the end - plus perhaps 45 minutes cooking time. What if you wanted to bake two cakes? You don’t have to get all the ingredients out twice, both cakes can probably cook in the oven at the same time, you know the recipe well, you might even be able to mix both cakes in the same bowl (if they are the same flavour!). All up, two cakes might take a total of 70 minutes - considerably less than 2x(15+45)=120 minutes. Now what about 10 cakes? Bakers do this regularly; they have larger ovens, industrial mixers, cooling racks etc. What about 100 cakes? 1000? 100000?

This is the issue of large-scale manufacturing. When you have to produce thousands or millions of copies of a particular item or product, the manufacturing process has to be redesigned from the ground up. Most of the quantum computing applications which are relevant to humanity right now (quantum chemistry, decryption etc.) require many thousands if not millions of qubits, so any would-be quantum computer manufacturer had better have a credible plan for creating such large numbers of controllable qubits. This doesn’t just mean “stamping” them out one by one: you must be able to fabricate them at scale, and then calibrate and control them. All the existing quantum designs are based on “small scale” experiments done in labs around the world. Turning these small-scale experiments into full engineering solutions for a large-scale quantum computer takes time, planning, testing and expertise in a technology which is exceptionally new and untested.

Decoherence.
If you buy two cheap plastic clocks, set them to the same time and then place them at opposite ends of your house, over time they will slowly drift out of synchronisation. They have slightly different mechanisms, their batteries might have slightly different levels of charge, they might even be in a warmer or colder part of the house. All these differences result in very slightly different speeds of rotation of the hands, and so the clocks speed up or slow down relative to each other.

Notice, I explicitly said cheap plastic clocks at either end of the house. If they are beautifully built grandfather clocks resting on a common wooden floor, then far more interesting physics is at play.

Now imagine there are 100 cheap clocks. After a while there will be a large range of different times and so all the clocks disagree; however, you might imagine that the average time is still approximately right. During operation of a quantum computer, the phase which is used to describe the state of each qubit varies as a function of time. If two qubits have slightly different environments, their phases vary more quickly or more slowly and they get “out of sync”. Physicists refer to this effect as dephasing or, more generally, decoherence. We say that the “coherence” of the qubits is lost over time.
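The clock analogy translates almost directly into a toy numerical model. Below is a short Python sketch (my own illustration, with arbitrary numbers): 100 “clocks” with slightly different rates, whose combined signal washes out as their phases spread.

```python
import numpy as np

# Toy dephasing model: 100 "clocks" (qubits) whose phases advance at
# slightly different rates due to slightly different environments.
rng = np.random.default_rng(seed=1)
detunings = rng.normal(0.0, 0.05, size=100)  # small random frequency offsets

for t in [0, 10, 50, 200]:
    phases = detunings * t
    # 1 = perfectly in sync, 0 = completely dephased
    coherence = abs(np.mean(np.exp(1j * phases)))
    print(f"t = {t:3d}   ensemble coherence = {coherence:.3f}")
```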

Unfortunately, coherence is essential for a conventional quantum computer to function (I say conventional here as it’s currently less clear how important coherence is in annealing machines, but that is an entire topic in itself). To build qubits that are perfectly coherent, we would have to control and understand all stray electric and magnetic fields, eliminate all vibrations, even isolate our computer from all light from the ultra-violet down well past the infrared. This is a level of complexity that has never been necessary, or even attempted, in conventional computers. In fact, an important advantage of modern digital computers is that the bit being in 0 or 1 is all that matters. So if electrical noise or other random influences make the signal go to 1.1 or 0.9... it still counts as a 1. Equivalently, 0.1 or -0.1 are treated as 0. It is inherently robust to noise up to a certain level.
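That robustness amounts to nothing more than thresholding, as this trivial sketch (my own illustration) shows:

```python
# A classical bit is recovered by thresholding, so moderate analogue
# noise on the signal simply disappears on readout.
def read_bit(voltage, threshold=0.5):
    return 1 if voltage >= threshold else 0

for v in [1.1, 0.9, 0.1, -0.1]:
    print(f"{v:+.1f} V -> {read_bit(v)}")
```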

Quantum computers (at least with components that exist in labs right now) have no such inherent robustness. Each different type of quantum computer has different decoherence sources which come from the materials used to make the machine and the way in which it is designed. For the last 10-20 years or more, the designs of quantum computers and the materials used to build them have been evolving precisely to try and reduce the amount of decoherence. Great strides have been made, but the errors introduced by decoherence are still millions of times greater than error rates in conventional computers.

However, all (coherence) is not lost. Peter Shor showed that we can use the concept of measurement collapse in quantum mechanics to perform quantum error correction. In short, the idea of quantum error correction is to use several (physical) qubits to represent the state of one “logical” qubit. This method of representing the state of the logical qubit (called an “encoding”) is constructed in such a way that if the correct operations and measurements are performed on parts of the encoding, the total system is collapsed into one of two situations: either no error, or one known error, which can be corrected. If this process is performed often enough, the effects of decoherence can be corrected. However, exactly what counts as “often enough” turns out to be one of the key critical issues in quantum computer design. It depends on how fast you can apply operations and measure your qubits, how many physical qubits are required to encode one logical qubit, how fast you can apply the required corrections, and how strong the decoherence was in the first place.
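To show the flavour of an encoding, here is a deliberately simple sketch (my own illustration, using classical bits to mimic the three-qubit bit-flip code, the simplest textbook encoding; Shor’s original nine-qubit code also handles phase errors). Measuring the two parity checks reveals which single bit flipped, without ever reading the encoded data itself:

```python
# Three-bit repetition code: one logical bit stored as 000 or 111.
# The two parity checks (q0 XOR q1, q1 XOR q2) identify any single
# bit-flip error without revealing the logical value.
def syndrome(bits):
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

CORRECTION = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

encoded = [1, 1, 1]            # logical "1"
encoded[2] ^= 1                # a bit-flip error strikes qubit 2
flip = CORRECTION[syndrome(encoded)]
if flip is not None:
    encoded[flip] ^= 1         # apply the indicated correction
print(encoded)                 # [1, 1, 1] -- logical state recovered
```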

Quantum control.
The remaining three DiVincenzo criteria can be loosely grouped under the heading “quantum control”. Control engineering is a vast and important field in the modern age, whether it is keeping people upright on a Segway, sending astronauts (or cosmonauts or taikonauts) into space, preventing your car skidding on a puddle or preventing a washing machine from destroying itself during the spin cycle. The ability to use small computers or electronic circuits to apply minor corrections to a machine to keep it operating correctly is one of the lesser appreciated but fundamentally important aspects of modern technology. A quantum computer will be no different. It will come with a myriad of electronic circuitry and computer systems to initialise each and every qubit at the start of a computation, to apply gate operations to actually perform the calculation, and then to measure out the result. (Although it should be said that due to the magic of quantum mechanics, it’s completely unclear if the calculation has actually been performed until after it has been measured!)

Although initialisation and measurement are generally understood for most major types of quantum computer designs, it is important to emphasise that these need to be performed exceptionally precisely (typically with something like a 99.9999% success rate). Similarly, there are sets of (quantum) logic gates which must be applied with similar precision. If one thinks of the usual computing gate operations (AND, OR, NOT, XOR, NAND, NOR etc.), in the quantum computing world all of these gates exist, as well as more exotic examples like Hadamard gates, controlled-NOT gates, T-gates, S-gates, Toffoli gates and iSWAP gates. Although we now know quite a lot about how these gates work and how they need to be put together to perform interesting calculations, how to do it optimally is still very much an open question. Is it best to have the fastest gates so that we can beat decoherence? Should we use the gates that are easiest to implement so they can be done with greater precision? Do we need all the gates, or just a few of them used often? When trying to implement quantum error correction, do we just introduce more errors from all the gates we have to apply? These questions all need to be answered, but the answer depends on which type of quantum computer you are building and how it performs, both on the drawing-board and in the lab.
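For the curious, two of those gates are enough to show the style of calculation involved. The sketch below (a standard textbook example, not specific to any hardware platform) applies a Hadamard and then a controlled-NOT to turn two qubits in the state |00> into an entangled Bell pair:

```python
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])        # controlled-NOT gate

psi = np.array([1, 0, 0, 0], dtype=complex)  # two qubits in |00>
psi = np.kron(H, np.eye(2)) @ psi            # Hadamard on the first qubit
psi = CNOT @ psi                             # entangle the pair
print(np.round(psi, 3))                      # (|00> + |11>)/sqrt(2)
```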

Once all of these questions are settled, we have a type of quantum computer that we can scale up, that has long decoherence times and that we can control - we are done, right? Well, not quite. There are then software challenges. How does one perform quantum error correction in the most efficient way? How do we design the support hardware and software (the classical computer that will control the quantum computer)? How do we design our qubits so that when we make millions of them, they are all identical (or close enough to identical)?

For a long time, the way forward was unclear and everyone working in quantum computing had their own ideas about what a working quantum computer would look like. Now, things are settling and there are a few leading quantum computer designs. This new focus is not necessarily because we know for certain which approach will win, but because a few major ideas have progressed far enough that we know the principles are sound and it is worth pushing the designs as far as we can. The recent entry of commercial quantum computing efforts has also focused attention much more on the mundane engineering problems required to ultimately get quantum computers to work, without the additional scientific overhead of needing to publish, graduate or get tenure.

Ultimately, the quest to build a quantum computer may well prove to be one of humanity’s most impressive technological feats of the 21st century, simply because it requires such precise control over the materials it is built from and the software used to run it. It is engineering at a scale and at a level of precision that we could only dream of a few decades ago.

- Jared Cole (cole@h-bar.com.au), co-founder, h-bar quantum consultants

Quantum Schmantum in Australia: The surprising depth of quantum technology research Downunder

Australia is a relatively small country in terms of research culture and influence on the world stage. The idealised self-image of Australians is that we “punch above our weight” and achieve great things with scarce resources - a romantic ideal which dates from when we were an isolated outpost of British colonial expansion. It can certainly be argued that Australian scientific contributions compare favourably to anything being done in other parts of the world. However, statistically speaking, we are still small compared to the scientific powerhouses of the United States, United Kingdom, Germany, Japan and, in the last 20 years, China. Per capita we perform better, but still lag behind the Nordic countries. With a population of just over 24 million and an economy strongly reliant on primary industry (mining, agriculture etc.), the country’s scientific research tends to focus on “areas of critical mass”. Some areas of focus are understandable from a social and economic point of view (mining, agriculture, medical research). Others are more coincidental; for example, astrophysics is particularly strong due to our Southern Hemisphere location and a strong history of support from the Commonwealth Scientific and Industrial Research Organisation (CSIRO). From an outside perspective, then, it may seem surprising that a major strength in Australian physics research is Quantum Technology.

To understand what I mean by strength, let’s discuss the quantum technology research landscape in Australia in 2017. The Australian Research Council (ARC) Centres of Excellence programme is considered the premier funding vehicle for fundamental and applied research. This programme focuses on groups of 10-20 lead investigators who typically already have a tenured position within an Australian university. A position within a successful Centre of Excellence is hotly contested, as it typically funds postdoctoral researchers, equipment, graduate student places, travel etc. for each of the lead researchers (or “Chief Investigators”). The focus is on big goals, collaborative and interdisciplinary research and a unified research effort, beyond the usual 1-5 person research teams funded through the “Discovery” programme - the standard ARC grant. The time period over which a Centre of Excellence is funded (7 years, with possible renewal) is also more than twice as long as a Discovery grant. More than anything else, a Centre of Excellence (CoE) gives stability to a scientist’s research.

What is surprising is how many of the Centres of Excellence funded currently (or in the past) have a Quantum Technology aspect. The CoE for Quantum Computation and Communication Technology (CQC2T) is obviously both the most visible and best funded of these Centres. It has existed in a similar form since 1999 and in fact predates the Centre of Excellence scheme. As well as obtaining the highest level of ARC funding, it has additional government, industry and military funding - over $10 million AUD per year at last count. The vast majority of this investment is focused on the singular goal of designing and building a silicon-based quantum computer. Given the collaborative nature of a CoE, this has resulted in an exceptionally high level of output in all areas of quantum computing that the Centre focuses on, both theory and experiment.

Although CQC2T gains most of the attention, there is an impressive depth of quantum technology research in other CoEs. The CoE for Engineered Quantum Systems (EQUS) includes several lead investigators who are CQC2T alumni. However, EQUS is focused on quantum technology more broadly. This includes quantum measurement, control, sensing and simulation - in short, everything except quantum computing specifically.

There are also a series of other CoEs with significant quantum physics research focused on technology and applications, but which do not specifically badge themselves as quantum technology centres. These include:

  • the Centre for Ultrahigh bandwidth Devices for Optical Systems (CUDOS), which focuses on photonic engineering and optical devices for communication and other technology applications.
  • the Centre for Nanoscale BioPhotonics (CNBP), which researches biomedical imaging applications and the control of light at the single-photon level for medical imaging, diagnosis and single-cell manipulation.
  • the Centre for Future Low-Energy Electronics Technologies (FLEET), which focuses on low-energy electronics using novel materials including two-dimensional films and topological insulators.
  • the Centre for Exciton Science (ACEx), which researches the generation, manipulation and control of excitons in molecular and nanoscale materials for solar energy harvesting, lighting and security applications.

You may notice two things immediately from that list. One, it is necessary to have an acronym for your Centre - the more memorable the better. Two, the focus and selling point of these Centres is far from quantum computing and quantum technology in general. Yet a closer look at the investigator list for each of these Centres will find many examples of former Centre for Quantum Computation members.

Dig a little deeper, into the ARC fellowships for early-career, mid-career and senior researchers (DECRA, Future and Laureate Fellowships respectively), and you will also find many examples of quantum technology research - often also by Centre for Quantum Computation alumni (or members of other closely related groups). In the most recent round, notable examples include Dr. Marcus Doherty (ANU), Dr. Gerardo Paz-Silva (Griffith), Dr. Lachlan Rogers (Macquarie), Dr. Christopher Ferrie (USyd), Dr. Fabio Costa (UQ), Dr. Peter Rohde (UTS) and Prof. Andrew Greentree (RMIT). This again reflects the great strength of quantum technology research in Australia.

The fact that such a strong quantum technology research focus appears in many different guises is very much a result of the way the CoE programme functions and how physics research in Australia evolves to fit the funding model imposed upon it. Each lead investigator has their own interests and focus, but where these interests best fit in the CoE scheme varies as a function of time and as a function of the CoE groupings. We see young researchers who “grew up” in one Centre move on with their research interests, eventually rejoining or forming a new cluster that starts to accrete researchers until sufficient critical mass is achieved to become a funded CoE. This itself is not so surprising for such a collaborative, long term scheme. What is unusual is the large number of Australian investigators that currently could be referred to as working in the quantum technology space, yet they are not part of the two big quantum technology based Centres.

Beyond ARC funded schemes, there are other examples of large scale investment in research in the quantum computing space in Australia. Microsoft have for quite some time had a strong presence in quantum information and computing theory via their StationQ research team. Recently this effort has stepped up a gear and moved strongly into experimental realisations of quantum computing, incorporating Prof. David Reilly's lab at the University of Sydney (who is also a member of EQUS). Just down the road, the University of Technology Sydney has formed the UTS Centre for Quantum Software and Information using a combination of UTS and ARC funding. Although these efforts are still technically University based, it is indicative of the worldwide pivot towards commercialisation of quantum computing technology - by the university, government and private sectors.

The strong focus on quantum physics and quantum technology in Australia is due to a range of factors, including historical precedent, governmental policy and an appeal to the Australian psyche. Since at least the 1980s, Australia and New Zealand have had exceptionally strong representation in the field of quantum optics. A standard collection of textbooks on quantum optics includes the names of many antipodean authors such as Walls, Gardiner, Carmichael, Bachor, Milburn and Wiseman. This is partly the influence of the great Dan Walls on New Zealand physics, and by extension Australia. However, it is also an artefact of a time when the fields of particle physics and condensed matter were dominated by the USA and USSR. Quantum optics was a “cheap and cheerful” science where real progress could be made with the limited resources available south of the equator.

With the advent of quantum computing in the mid 90s, the tools used in quantum optics were perfectly suited to this new and exciting field. For the first time in many decades, brand new concepts and results in quantum physics were appearing monthly, sometimes weekly. For the quantum optics specialists of New Zealand and Australia and their students, it was an easy jump into this new field. Twenty years later, it is no coincidence that we have an entire generation of established physicists with a sound knowledge of quantum technology. 

To this strong quantum technology research environment, add several quirks of the Australian system. First, in Australia PhD students are essentially “free”. They are paid by government scholarships which cover both their fees and a stipend, and therefore don’t cost the doctoral supervisor’s grants anything other than conference travel or computer resources. The result of this funding arrangement is that the secret to getting high-quality PhD students is not necessarily to have large grants, but to have interesting projects and a stimulating research culture - something that quantum computing and technology has had right from the start. Secondly, due to the high cost of living and good working conditions, Australian postdoctoral positions are well paid and therefore expensive. This means that once a student completes their PhD, the number of local positions is very limited and going overseas for more experience is necessary if one wants to make a career as a physicist. The result is that many labs around the world have an Australian working in quantum technology. Even burgeoning commercial quantum computing efforts such as those at Google and IBM have key members who learned their trade in the Centre for Quantum Computation during its formative years.

These quirks have resulted in an effective system for training specialists in quantum technology and spreading them throughout the world. However, there are two more ingredients which have contributed to the exceptionally strong focus of Australian quantum physics research. One is that the Australian diaspora, by and large, are still trying to come home. A strong sense of national identity and generally excellent living conditions (and weather) make Australia an attractive proposition, even for those who weren’t born here. It is an effect also seen in Australian actors and business leaders: even after spending many decades in Europe or North America, they will often take a position back in Australia at some time before retirement. This means that academic positions at Australian universities are increasingly hard-fought rarities which attract a raft of exceptional candidates. Each newly formed Centre of Excellence or collaborative research group has no space for weak members.

Of course, the return of highly trained expats applies to all branches of science and academia in general. What seems to be different about physics and quantum technology in Australia is that physicists are adaptable. Sitting somewhere between the intellectual safety-harness of formal logic in mathematics and the application-driven focus of engineering, chemistry and biology, physics in the 21st century is often about being able to tell a good story to explain your work’s significance. As this has become paramount to obtaining acceptance from our peers, it is a relatively straightforward step to apply it to convincing grant agencies of the importance of the research.

In addition, the last decade or so has seen an almost blind faith in publication metrics. Job applications include total citations, h-index and lists of high-impact journal publications as a matter of course. The short-listing of job applicants by HR departments and Research & Innovation offices has removed the subtlety of judging research potential; now, a steady stream of high-impact, highly cited journal papers is the key to the elusive tenured position. This is a game for which quantum technology is perfectly suited. New tools, new applications and new concepts appear all the time. A junior researcher can make a name for herself with just a couple of key results that spark a new flurry of activity in the research community. Contrast this with the slow and steady incremental work in many other branches of physics and it is little wonder that since the turn of the century quantum technology research has had such a grip on physics.

This of course brings us to pontificating about the future. Can this expansion continue? Well, in terms of quantum computing, in 2017 we really are at the pointy end of the business. Quantum computing is now a research reality in commercially funded labs. It is just a matter of time before enough qubits are wired together to perform a calculation that cannot be simulated classically, even in its simplest form. Quantum cryptographic systems can be purchased from several companies worldwide. Quantum metrology and sensing is becoming more mainstream in the scientific community and will eventually cross over to become mundane in the commercial sector as well. However, the pace of discovery in academia is slowing. The problems are harder, the progress more incremental. Having said that, the foundation of quantum physics knowledge that has been built in Australia will not disappear any time soon. Physicists are adaptable, always looking for unsolved problems to hit with shiny new hammers. Whether it is new problems or new tools, the career incentives continue to favour those who find them. The question is simply whether the quantum technology community can focus its energy on problems of enough significance to mankind to continue justifying taxpayer funding. Finding things to do is never difficult for an academic; finding worthwhile things to do is the challenge.

- Jared Cole, co-founder, h-bar quantum consultants

Postscript: Please email me if you believe I have left out a significant quantum technology research effort within Australia. Also, special thanks to A/Prof. Tom Stace for providing the inspiration for the title of this article.

Full disclosure: A/Prof. Jared Cole is currently a chief investigator within ACEx and an associate investigator within FLEET. His PhD was in quantum computing within the CoE for Quantum Computer Technology (the precursor to CQC2T) from 2003-2006.

h-bar joins the Quantum World Association as a founding member for Asia.

Today sees the official launch of the Quantum World Association, which h-bar joins as a founding member for Asia. Below is the full text of the official press release.

- Simon J. Devitt, co-founder, h-bar: Quantum Consultants

PRESS RELEASE

Official presentation of the Quantum World Association

at the Mobile World Congress

Leading global quantum companies found the first worldwide quantum association.

Mobile World Congress, Barcelona March 2nd 2017

The impact of first-generation quantum technologies on our everyday life is a reality in several areas, such as lasers, magnetic resonance imaging and GPS.

The second quantum revolution is already underway. Governments and companies worldwide are investing substantially to unleash the power of quantum technologies. The first quantum communication satellite was launched last year, and several quantum devices for cybersecurity, quantum sensing and metrology, quantum simulation and other applications are already operational.

Key government and commercial projects, at hardware and software level, are under development.

With the vision to bring together players in the field of quantum technologies and services, a group of leading companies has decided to found the Quantum World Association, a not-for-profit organization based in Barcelona (Spain).  

The vision of the Quantum World Association is to connect researchers, universities, companies and institutions to develop a quantum ecosystem and further promote quantum technologies. 

“Following the great acceptance of the quantum ThinkTank Barcelonaqbit, six months ago we decided to take a further step and create an Association,” said Alfonso Rubio-Manzanares, CEO of Entanglement Partners (Spain) and co-founder of the Quantum World Association.

The association was officially presented on March 2nd at the Mobile World Congress.

“The Association has the objective to connect quantum industry and scientific leaders to create common standards, to understand business insights and to be a knowledge center for the industry,” said Giorgio Maritan, Managing Director of the Quantum World Association.

“Cybersecurity is one area where organizations are already preparing today for the quantum era, which is driving the growth of a new industry that brings both conventional and quantum technologies designed to be safe in an era with quantum computers” said Michele Mosca, CEO and Co-founder of EvolutionQ Inc.  (Canada).

“During the Mobile World Congress, we were able to demonstrate some of our quantum-safe security solutions, including our existing quantum random number generator and Quantum Key Distribution solutions. The cyber security community must integrate the risk of quantum computing into its strategy and protect data for the long-term future. Our common challenge is to help governments and enterprises to get ready in a timely manner,” said Dr Grégoire Ribordy, CEO of ID Quantique (Switzerland). 

The Quantum World Association confirmed that it is structuring three chapters that will help its international development.

The chapters are the following:

  • Asia. Led by H-Bar (Australia) 
  • Americas. Led by EvolutionQ (Canada)
  • Europe. Led by IDQuantique (Switzerland) and Entanglement Partners (Spain)

“The Quantum World Association is a major milestone in the field of quantum technology. It illustrates to the world that we are ready to move out of the physics laboratory and into the industrial and commercial space. H-Bar is privileged to be a founding member and we anticipate a new revolution in the information technology sector, firmly grounded in quantum technology,” said Simon Devitt, co-founder of H-Bar (Australia).

"Our vision is to become the epicenter of Quantum knowledge empowering companies and other institutions to lead the future of quantum industries.

We are proud of the commitment of our founding members and we encourage companies to be involved in our association. Let's build the future together" said Oscar Sala, Chairman of the Board of Quantum World Association

Company Information:

About ID Quantique

Founded in 2001 as a spin-off of the Group of Applied Physics of the University of Geneva, ID Quantique is the world leader in quantum-safe crypto solutions, designed to protect data for the future. The company provides quantum-safe network encryption, secure quantum key generation and Quantum Key Distribution solutions and services to the financial industry, enterprises and government organizations globally. IDQ’s Quantum Random Number Generator has been validated according to global standards and independent agencies, and is the reference in highly regulated and mission-critical industries - such as security, encryption and online gaming - where trust is paramount.

IDQ’s products are used by government, enterprise and academic customers in more than 60 countries and on every continent. As a privately held Swiss company focused on sustainable growth, IDQ is proud of its independence and neutrality, and believes in establishing long-term and trusted relationships with its customers and partners. 

For more information, please visit http://www.idquantique.com

About evolutionQ Inc:

Powerful new quantum technologies promise tremendous benefits, but also pose serious threats to cybersecurity.  

evolutionQ is the first company worldwide dedicated to offering the services and products organizations need to manage their quantum risk and to deploy cyber tools designed to be safe against quantum computers in a timely and cost-effective manner.

evolutionQ was founded and is led by global leaders in quantum-safe cybersecurity credited with:

  • Leading fundamental research underpinning quantum-safe cybersecurity
  • Co-founding the Institute for Quantum Computing
  • Initiating and driving global standardization efforts
  • Teaching and training the quantum-safe workforce
  • Transferring knowledge and technology to industry and government for over two decades.

With a team of individuals with decades of experience bringing new cryptographic tools into widespread application, evolutionQ can evolve your organization to a quantum-safe position.  

For more information, please visit http://www.evolutionq.com/

About H-Bar

Founded in 2016 in Melbourne, Australia, H-Bar is the first expert-driven consultancy firm in the emerging field of quantum technology. We are on the cusp of a second revolution in information processing and electronics, with devices being built to actively exploit the strange and often counterintuitive rules of quantum mechanics. We are here to help and guide your understanding of this exciting new field and identify new and lucrative investment opportunities in this world-changing class of technology.

Our team consists of world-recognised experts in the fields of quantum computing and communications systems, solid state, condensed matter and optical physics, quantum software engineering, and the experimental fabrication and design of active quantum technology. Founded by Dr. Simon Devitt (Riken, Japan and Macquarie University, Sydney) and Dr. Jared Cole (RMIT University, Melbourne), and recently joined by our new partner Professor Keith Schwab (Caltech), we offer consultation services to bring together the scientific expertise in quantum technology with the commercial and industrial sector, to help create an entirely new industry focused around quantum information and quantum technology.

For more information, please visit http://www.h-bar.com.au/

About Entanglement Partners

Entanglement Partners is the first consulting company in Spain and Latin America whose business is focused on quantum technologies: Quantum Computing, Telecommunications, CyberseQurity, Simulation and Algorithms.

Entanglement Partners has been founded by a multidisciplinary team of business executives and internationally recognized scientific professionals who are specialized in quantum technologies.

The company develops its activity mainly in three areas: strategic technology consulting, distribution and deployment of quantum products, design of projects related to quantum telecommunications and technology.

Founded in 2016, it is headquartered in Barcelona with offices in Madrid (Spain), San José (California USA) and Kerala (India).

For more information, please visit http://www.entanglementpartners.com/

Professor Keith Schwab joins h-bar

Prof. Keith Schwab, from www.kschwabresearch.com

We are delighted to announce that h-bar has a new partner. Keith Schwab, Professor of Applied Physics at Caltech, will be joining h-bar as a member of our consultancy team. Professor Schwab has a long and distinguished career in applied physics and quantum technology, and his expertise will be invaluable to h-bar and our clients.

Professor Schwab received a BA in physics in 1990 from the University of Chicago and worked with Prof. Dick Packard at the University of California, Berkeley during his Ph.D., which was awarded in 1996. After his Ph.D., Professor Schwab spent the next four years at Caltech as the Sherman Fairchild Distinguished Postdoctoral Scholar.

In 2000, Prof. Schwab joined the National Security Agency, forming one of the first research groups investigating the applications of active quantum effects in technological devices, working on quantum metrology, quantum nano-mechanics and superconducting qubits.

After he left the NSA in 2006, Prof. Schwab joined the faculty of Cornell University, and in 2009 accepted the position of Associate Professor of Applied Physics at Caltech, where he is now a full Professor of Applied Physics.

Prof. Schwab is considered one of the world’s best experimental quantum physicists and one of the pioneers of quantum technology. He has published numerous experimental papers in the most prestigious international journals, has given over 50 invited talks at international conferences and workshops, and has extensive experience in academia, government and industry.

We are delighted for Prof. Schwab to be joining h-bar and expect his knowledge and expertise to help us deliver exceptional service and advice to our clients. 

For more information, please visit Prof. Schwab’s website.

- Simon Devitt and Jared Cole, Founders of h-bar: Quantum Consultants

Blueprint for an ion-trap quantum computer

Science Advances, Vol. 3, no. 2, e1601540 (2017)

Today in the journal Science Advances, researchers from the ion trapping group of the University of Sussex in the UK, Aarhus University in Denmark, Siegen University in Germany, Google Inc. and RIKEN in Japan have proposed a fundamentally new architecture for an ion-trap quantum computer. I was a part of this research and am very excited to work on a method for ion-trap quantum computing that can form the basis of a large-scale machine.

Ion-trap quantum computers have long been one of the leading technologies for large-scale quantum computing. The underlying technology is very mature, having been developed initially for very accurate atomic clocks. When quantum computing first developed in the 1980s and 1990s, ion traps were among the first technologies to experimentally demonstrate individual quantum bits (qubits) and, since then, technology development has been substantial.

In an ion-trap quantum computer, individual qubits are ionised atoms; some systems use calcium, some use beryllium and some use ytterbium. Because the atom is ionised (i.e. carries a net positive charge), it can be trapped and held in place by an electromagnetic field inside an ultra-high-vacuum container. This vacuum is required to make sure that the ion is not knocked out of the trap by collisions with other atoms flying around inside the system. The qubit itself is defined by the quantum state of a single electron of the ion. Two stable electronic states are chosen to represent the binary zero and one states, and these states can be manipulated via lasers or by manipulating the magnetic field environment of the ion.
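As a toy picture of that two-level manipulation, a resonant drive rotates the qubit between its zero and one states at the so-called Rabi frequency. The sketch below (my own illustration, with an assumed Rabi frequency; real values depend on the ion and the drive) shows how the excitation probability depends on pulse duration:

```python
import numpy as np

# Resonantly driving a two-level qubit for a time t gives a probability
# sin^2(Omega * t / 2) of finding it in |1> (Rabi oscillation).
Omega = 2 * np.pi * 50e3            # assumed Rabi frequency: 50 kHz
for t_us in [0.0, 2.5, 5.0, 10.0]:  # pulse durations in microseconds
    p1 = np.sin(Omega * t_us * 1e-6 / 2) ** 2
    print(f"t = {t_us:4.1f} us  ->  P(|1>) = {p1:.2f}")  # 10 us is a full flip
```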

Manipulation of a single ion qubit is now routine in laboratories around the world.  Injecting and trapping an ion, performing single qubit quantum gates and reading out individual qubits can be done with extremely low error rates, in multiple systems, and many small-scale tests and protocols have been demonstrated over the past decade and a half.  

Operations on multiple qubits are also possible, by coupling ions through the shared motional degrees of freedom of two (or more) ions held in the same trap. Because individual ions are positively charged, ions placed in the same trap experience a mutual Coulomb repulsion. This mutual repulsion changes slightly when the electronic configuration of each individual ion changes, and hence can be used to enact quantum logic gates between two qubits. Again, through careful control of the system, experimentalists have enacted logic operations between qubits and realised small-scale programmable ion-trap quantum computers.

The question that physicists and engineers are now addressing is scalability, namely how do we increase the number of qubits in the system to enact complex and required error correction protocols and scale the system to sufficient size to perform quantum algorithms that cannot be realised on even the most powerful classical supercomputers?

An ion-trap X-junction, the building block of an ion-trap quantum computer. The gold-coloured base plate consists of a series of electrodes that are used to manipulate the electromagnetic field used to trap individual ions. This allows us to trap ions in separate regions of the machine in order to “load” ions (injecting qubits into the computer), measure the quantum state of ions and entangle ions together (performing gate operations between two ions).

Scaling ion-trap computers to the level of millions (if not billions) of qubits requires very careful design.  Luckily, ion-trap computers have a rather unique property: qubits can be moved (shuttled) around, they are not fixed in place.  By manipulating electromagnetic fields that are used to trap individual ions, they can be moved and shuttled around the computer.  This allows us to trap ions separately and move them around to inject or "load" them into the computer, measure them in dedicated readout zones and to entangle them with other ions in the computer, fast and with very low error rate. 

X-junctions are fabricated together in a grid. Each X-junction contains a single ion qubit that can be initialised, made to interact with its four neighbours to the north, east, south and west, and measured. Repeating this structure allows for an arbitrarily large error-corrected quantum computer, capable of implementing any algorithm.

Even with the very low error rates that experimentalists can achieve with ion-trap technology, they are still not good enough for large-scale algorithms such as Shor's factoring algorithm or Grover's search algorithm.  Active error correction codes are still needed.  The ion-trap architecture is consequently designed around a class of topological error correction codes, known as surface codes.  Surface codes are a desirable method for large-scale, error-corrected quantum computers as they are amenable to system design and have very good performance.  Surface codes only require error rates for each physical operation in our computer to be below approximately 1% before they begin working effectively.  Error rates at 1% or lower are already experimentally achievable in ion-trap systems. 
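To get a feel for what working below that threshold buys you, here is a back-of-envelope sketch (my own illustration, using a rule-of-thumb scaling formula from the surface code literature, not a result from this paper): below threshold, the logical error rate falls exponentially as the code distance d grows.

```python
# Rule-of-thumb surface code scaling: p_logical ~ 0.1 * (p / p_th)^((d+1)/2)
# for physical error rate p, threshold p_th and code distance d.
p_th = 0.01      # ~1% threshold quoted above
p = 0.001        # assume operations 10x below threshold

for d in [3, 7, 15, 25]:
    p_logical = 0.1 * (p / p_th) ** ((d + 1) / 2)
    print(f"distance {d:2d}: logical error rate ~ {p_logical:.0e}")
```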

In other designs for ion-trap computers, physicists have imagined building small mini-computers, each containing anywhere between 10 and 100 physical ion qubits. These mini-computers would then be linked together with photons and optical fiber. This would allow scale-up by connecting together separate and comparatively small ion traps to form a larger computer. Unfortunately, the downside to this approach is that establishing an optical connection between separated ion traps is both very slow and very noisy - two things that are detrimental to a functional and useful quantum computer.

In our approach, we decided that a monolithic design for an ion trap is better. The X-junction shown above allows an individual ion to interact with its four neighbours; hence, to scale the computer to arbitrary size, we just physically connect many X-junctions together and shuttle ion qubits between X-junctions to perform gates.

A module is a 36x36 array of X-junctions fabricated with the necessary control electronics and mounted on a steel frame with piezo-actuators that allow modules to be aligned with one another. With one ion qubit per junction, each module houses 1,296 qubits of our quantum computer.

We define a module that consists of a 36x36 array of X-junctions, each junction containing a single qubit of our quantum computer. This module contains all the control structures necessary to manipulate the qubits in the ion trap. Below the surface of the trap (where each individual qubit hovers about 100 micrometres above the electrodes) there are layers of electronic control and cooling. Finally, the module is mounted on a set of piezo-actuators, which are in turn attached to a support frame. The piezo-actuators are used so that two modules can be aligned with each other and ions transported across the junction between two modules. Our analysis showed that provided each module was aligned to better than 10 micrometres in the x, y and z directions, we could still reliably shuttle ions between modules.

If this module can be built, scaling the quantum computer to arbitrary size simply requires fabricating more and more modules and connecting them together. In this way, the ion-trap quantum computer can operate as fast as possible with very low error rates, and does not require us to build and integrate additional quantum technology such as photonic interconnects, which have so far proven difficult to build reliably with good performance.

By connecting modules together we can scale the computer to arbitrary size. Shown are several connected vacuum systems containing approximately 2.2 million X-junctions. This system would occupy the space of a mid-sized office and be able to run fully error-corrected quantum algorithms. The entire computer is housed in an ultra-high vacuum, to eliminate any stray atoms that could collide with ion qubits.

Scaling an ion-trap quantum computer will require some very high-quality engineering. Each module contains enough X-junctions to accommodate 1,296 ion qubits and occupies a physical space of 90mm x 90mm - a comparatively large footprint for a quantum computer. We can envisage a much larger system, as illustrated, which contains 2.2 million X-junctions in a series of connected vacuum chambers (hence 2.2 million qubits). The size of each chamber is 4.5m x 4.5m, about the size of a mid-sized office. Additionally, the entire quantum computer must maintain an ultra-high vacuum inside for the length of time necessary to run a quantum algorithm (which may be anywhere from seconds to weeks).
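As a sanity check on those numbers (my own back-of-envelope arithmetic, not a calculation from the paper), the module and junction counts are consistent with the quoted footprint:

```python
junctions_per_module = 36 * 36            # 1,296 X-junctions, one qubit each
modules = 2.2e6 / junctions_per_module    # modules for 2.2 million junctions
print(f"modules needed: {modules:,.0f}")  # ~1,700

side = modules ** 0.5 * 0.090             # square array of 90 mm modules
print(f"array footprint: {side:.1f} m per side")
# ~3.7 m per side, consistent with the 4.5m x 4.5m chambers once room is
# left for support frames and interconnects.
```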

While the engineering challenges are significant, they are not insurmountable, and much of the research in the ion-trap community is focused on these issues. One major adaptation that we made in this architecture is the elimination of a large amount of laser control. In more traditional ion-trap quantum computers, every operation on ion qubits (except for shuttling) is mediated by precisely focused laser beams. For a system containing millions of qubits, the amount of laser control required would be enormous and potentially very costly to the design of a large-scale machine.

We remove costly and difficult laser control of each individual ion qubit, replacing it with a microwave pulse that is broadcast over the entire computer. Ions that we wish to address with the pulse are "tuned in" by manipulating the local magnetic field environment with control wires embedded under the surface of the ion trap.

In 2016, the ion-trap group at Sussex University (who led the work on this paper) demonstrated a new technique to control and manipulate ion qubits. Instead of using tightly focused laser beams, the group used a microwave pulse that was broadcast over the entire ion trap. The ions that they wanted to react to this microwave pulse were "tuned in" via precise control of the magnetic field environment around each particular ion. In this way, one microwave pulse can enact operations on large numbers of qubits simultaneously, with the relevant qubits tuned in by changing their local magnetic fields. This eliminates the need for selective laser control of every ion qubit in the machine. The local magnetic field at each X-junction is controlled with wires embedded underneath the surface of the ion trap. By controlling the electrical current through these wires, we can alter the magnetic field near a particular ion and "tune it in" to the global microwave control pulses applied over the entire computer.
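To give a rough sense of the scales involved (my own illustrative numbers, not values from the paper), the frequency shift produced by a small local field offset is easy to estimate:

```python
# Frequency shift of a magnetically sensitive qubit transition:
# delta_f = gamma * delta_B, with gamma ~ 28 GHz/T for an electron spin.
gamma = 28e9          # Hz per tesla (approximate electron value)
delta_B = 50e-6       # assumed local field offset: 50 microtesla

delta_f = gamma * delta_B
print(f"frequency shift: {delta_f / 1e6:.1f} MHz")  # ~1.4 MHz
# A qubit detuned by much more than the microwave Rabi frequency barely
# responds to the global pulse; removing the offset tunes it back in.
```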

We believe that this model of an ion-trap quantum computer may be significantly easier to engineer, and ultimately build, than other designs. Many of the components of this monolithic design have already been demonstrated experimentally, and much of the remaining challenge is to put all these pieces together and slowly scale the system: first to tens of qubits, then to hundreds, thousands and hopefully millions in the not-too-distant future.

The future of ion-trap quantum computing looks very bright and this technology is a direct competitor to superconducting quantum computing designs pioneered by places like IBM and Google.  Both technologies maturing at the same time gives us tremendous flexibility in how we adapt quantum computing technology to specific commercial tasks in this new and exciting technology sector.

- Simon Devitt, co-founder of h-bar quantum consultants.

Quantum technologies and the launch of h-bar quantum consultants

It is with great pleasure that I am writing the first blog post for the official launch of h-bar quantum consultants. h-bar aims to provide professional advisory services to the burgeoning quantum technology industry, liaising between academia, government and business to provide detailed and up-to-date advice on a new technology with a very steep learning curve.

The translation of quantum technology from the laboratory to commercial devices promises to be one of the great challenges of 21st century science and engineering. The United Kingdom’s National Quantum Technologies Programme and the recent announcement by the European Commission of a €1 billion Quantum Technologies Flagship are both targeted specifically at developing commercial quantum technologies. Worldwide we also have large scale investment in quantum technology research by government agencies in the United States, Canada, China, Japan, South Korea and Australia. Despite all this public investment in quantum technology commercialisation, for many applications we are still at the very start of the long road from research and development to market. It is a very exciting time to be in the field.

So what is quantum technology? Quantum physics is often referred to as “modern” physics, yet the principle of quantised energy which underlies quantum mechanics was first discussed more than 100 years ago. Throughout the 20th century, a range of new technologies were developed which in some way rely on this fundamental understanding of the universe. The operation of transistors, LEDs, MRI machines, lasers and many more is understood using the principles of quantum mechanics. However, these technologies are now increasingly referred to as “first generation” quantum technologies.

This of course begs the question: what is a “second generation” quantum technology? A useful definition is given by Georgescu and Nori - technologies harnessing quantum superposition, uncertainty or quantum correlations are “second generation” quantum technologies. However, this is quite a technical definition which doesn’t help non-specialists understand what such a distinction means, or why it is important to differentiate at all.

For me, a simpler definition is that second generation quantum technologies are those which require (or benefit from) control over the quantum mechanical wavefunction of a system. The wavefunction is a central concept from quantum theory. It provides a mathematical description of the state of a system, i.e. what is a quantum mechanical system doing right now? However, many of the counter-intuitive results of quantum mechanics that can be confusing at first sight come from the fact that measuring the wavefunction directly is particularly difficult. Rather we infer the value of the wavefunction from the probabilities of measurement outcomes. 
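A short sketch makes this inference concrete (my own illustration with an assumed state; real experiments estimate probabilities the same way, from repeated single-shot measurements):

```python
import numpy as np

# The wavefunction is never read off directly. Here the "unknown" state
# is (|0> + |1>)/sqrt(2); the Born rule gives P(1) = |beta|^2, which we
# estimate from simulated single-shot measurements.
rng = np.random.default_rng(seed=7)
beta = 1 / np.sqrt(2)
p1 = abs(beta) ** 2

shots = rng.random(10_000) < p1   # 10,000 simulated measurement outcomes
print(f"estimated P(1) = {shots.mean():.3f}  (true value {p1:.3f})")
```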

The reason that control (or lack thereof) of the wavefunction provides a good definition is that most first generation technologies can be understood using a mathematical theory based on the probabilities only. This is also why they have been quickly incorporated into existing technologies over the last 50 years. It is only in the last 10-20 years that we have developed the technology to control the wavefunction itself. With this enhanced control we have discovered a raft of new applications including quantum cryptography, quantum computing, quantum metrology and quantum sensing. These technologies promise to allow us to hide our data more completely, solve tough mathematical problems more efficiently and sense the world around us with higher precision than ever before. However, we are still just at the very beginning.

The late 19th century developments in electromagnetism led to large-scale technology applications in radio and electronic engineering after the First World War. The discovery and harnessing of nuclear physics during the Second World War led to the field of nuclear engineering, following the declassification of the field in the 1950s and 60s. We are only now starting to see the first generation of “quantum engineers”.

So where does h-bar fit in with all of this? As quantum technology becomes more of a commercial reality, it will be essential to have good information flow between scientists, engineers, business and government. Here at h-bar, we provide this service, linking stakeholders and helping translate between the very different wants and needs of the fledgling quantum technology industry. We provide frank and impartial advice on all aspects of quantum technologies. Having played our part in the development of these technologies, we now aim to shepherd them through to full commercial applications.

- Jared Cole, co-founder, h-bar quantum consultants