Episode 4 of Talking Engineering: An Interview with Michael Tusch, October 2020

Michael started his career as a researcher in semiconductor quantum theory at Oxford before moving into industry, first at the Boston Consulting Group and then in several management positions. He founded Apical in 2002.  

Apical is a Cambridge-based technology company specialising in image- and video-processing technology. It is one of the UK’s fastest-growing tech companies, and its technology is probably in your pocket at this very moment. Apical develops truly cutting-edge, next-generation camera and display systems which incorporate knowledge of the human visual system. In 2016, ARM acquired Apical for $350 million.  


JH: Michael, what drew you towards studying Physics and semiconductor quantum theory in particular? 

MT: I actually ended up going to university originally to study chemistry, and over time I found that the part of the subject that most interested me was the theoretical aspect, essentially the quantum theory that underlies all reactions. Chemistry has its roots in quantum mechanics; for example, the periodic table comes from solving the Schrödinger equation for the hydrogen atom. I didn’t really have any aptitude for the practical side of chemistry, and it was my deep interest in the theoretical side of the subject that led me towards theoretical physics.  

At the start of my DPhil, I decided to go into condensed matter theory, which is all about how electrons behave in solids and liquids, and how this determines the properties of the material, whether it is a magnet, or an insulator, a semiconductor or even a superconductor. I found this absolutely fascinating, and so I suppose the reason I decided to go into this field of study was purely down to interest, as opposed to knowing what I wanted to do with my career.  

JH: In 2002 you founded Apical. Could you briefly describe how you got from finishing your research to founding the company? 

MT: After my postdoctoral research, I decided I wanted to explore life outside the university world. Not having much idea of what I wanted to do, I originally decided to become a management consultant and leave all of my scientific work behind. Once I had explored a number of areas within business and commerce, I quickly found that the area which most interested me was the technology sector.  

I left the consultancy firm because I decided I didn’t really have a career there. I worked with my father for a while – he had a manufacturing business – and this gave me an insight into aspects of what it was like to run a business, from dealing with suppliers to running a profit and loss account. This allowed me to realise that I had the basic tools I needed to start up my own company. I looked around for quite a while for an idea for a technology start-up that I could build and take forward. You see, when founding a start-up, unless you do very thorough due diligence at the start, it’s very hard to get anywhere. I was involved in a research project which led indirectly to the technology behind Apical. In the early stages, I worked on modelling the way the eye works. The realisation that we could take this information and turn it into a digital technology that could be used in things like cameras was what prompted me to incorporate Apical. 

So that’s how I started Apical. In summary, it had all the characteristics of something that I could form a start-up from, and in addition, I foresaw that there would be a huge market a few years down the line.  

JH: You spoke about how you gained inspiration from the human visual system in the early stages of your business. How exactly are Apical’s products modelled on the human visual system? 

MT: If we go back to the origins of the technology, we were looking at how cameras work and comparing them to the way the human eye works. There’s a certain area of science called biomorphic systems, which is where engineering tries to gain inspiration from nature. After all, evolution has been taking place for a couple of billion years and it has come up with some very good solutions! The biggest difference between a camera and the human eye is not the lens, nor the artificial retina in a camera (the image sensor). These are actually quite similar in terms of how they work in an eye and in a camera. The big difference has to do with how the eye and the camera process the information they receive.  

You see, when the eye picks up light, it converts this energy to electrical signals in the retina which get sent to the brain. However, right before the signals are ‘sent off’, the eye actually carries out some computation on them. Effectively, there is a neural network in the retina which transforms the electrical signals before they go to the brain, and this allows our eyes to be extremely adaptive. One specific example of this is how well our eyes adapt to changes in brightness. On a sunny day, it will be at least a thousand times brighter outside than inside. Yet, when you go outside, it may seem a little brighter, but not by a factor of a thousand. Therefore, if you want a camera to take realistic pictures at different levels of brightness, you have to implement a similar form of neural processing. 
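A toy sketch of the adaptation Michael describes, assuming a simple logarithmic response and illustrative lux values (this is not Apical’s actual algorithm):

```python
import math

def perceived_brightness(luminance_lux, adaptation=1.0):
    """Toy model of eye-like adaptation: the response grows roughly
    logarithmically with luminance, so a thousandfold jump in light
    does not feel a thousand times brighter."""
    return math.log10(1 + luminance_lux / adaptation)

indoor = perceived_brightness(100)       # ~100 lux, a lit room
outdoor = perceived_brightness(100_000)  # ~100,000 lux, direct sunlight
# The raw light differs by 1000x, but the modelled response by far less:
print(outdoor / indoor)
```

The same idea underlies tone mapping in cameras: raw sensor values spanning a huge dynamic range are compressed non-linearly so the result matches what the eye would perceive.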

Our first product was a small core that could be built into any imaging device to enable that device to adapt to different conditions in a similar way to the human eye. Nowadays, pretty much every phone camera in the world contains something based on our technology in order to produce good-looking pictures.   

JH: Since then, how have you diversified the products that Apical manufactures? 

MT: The way a modern smartphone is produced involves a long supply chain consisting of very many different companies. Apical’s role in this supply chain is to produce semiconductor IP. When a chip is produced, it is designed in a big CAD system somewhere and then sent to a factory (probably in Taiwan, which has the most advanced factories of this sort in the world), where it can be ‘printed out’. Approximately one or two square millimetres of this chip are designed by Apical. 

What we started off doing was producing a tiny core, which could be injected into this chip and could transform the camera from a ‘dumb camera’ into one which saw like the human eye. As the world changed and smartphones came along, we diversified by expanding that tiny core, ultimately taking over the whole camera system of certain devices. Essentially, we went up the food chain by increasing the percentage of the final product designed by us. 

Another way we diversified was by applying our technology to other areas. For example, we adapted our technology to displays. Nowadays, if you use your mobile phone outdoors, you can see it fairly well despite the brightness of your surroundings, and this is partially due to our technology.  

Around 7 or 8 years ago, we started to realise that in the future, the ability of computers to take, process, and look at images would be very important. We asked ourselves how we would have to redesign our systems so that computers could use the imagery, and we concluded that we could use AI to interpret pixel data and describe what is in an image. This is the technology that our latest products have revolved around.  

JH: What other areas of technology excite you the most? 

MT: Since I’m no longer the full-time CEO of Apical, I now have more of an opportunity to explore other things. I’m still very much interested in embedded systems, and particularly their medical applications. Technologies that used to be incredibly expensive have now become more specialised and much cheaper, for example image sensors, thermal cameras and radars. I’m interested in finding applications of these to health and well-being, in order to answer the questions of how we can keep people healthy for longer and how we can detect illnesses earlier.  

JH: With all of your experience, would you agree with the description of engineering as ‘applied science’? 

MT: Yes, I think that’s true to some extent. Any type of technology is based on scientific discoveries that took place many, many years earlier. For example, the exciting surge of technology we’ve had over the past 20 years – mainly involving smartphones – has essentially involved incremental improvements on discoveries that were made decades earlier by people such as Marconi with radio, and all those who pioneered early computers in the 1950s. So although everything seems to be very new, the seeds of what we are seeing nowadays were actually sown many years ago.  

Science definitely underpins everything that goes on in the engineering world. However, this relationship can also go the other way: there are many people whom we would call engineers who are opening up new areas of science.  

An example of this is the constant decrease in the size of transistors on microchips, as described by Moore’s Law. Historically, the number of transistors on a chip – and with it the chip’s performance – has doubled roughly every two years. The smaller the transistors that can be manufactured, the more of them can be fitted onto a chip, and the higher the performance of the chip. Thus, with new developments in science and technology, engineers have been able to consistently shrink these transistors year by year.  
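The doubling Michael mentions compounds very quickly; a minimal sketch with illustrative numbers (the starting count and timespan are examples, not figures from the interview):

```python
def transistor_count(start_count, years, doubling_period=2):
    """Project transistor count under an idealised Moore's Law:
    the count doubles once every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# A hypothetical chip with 1 million transistors, after 20 years of
# doubling every 2 years, grows by 2**10 = 1024x:
print(transistor_count(1_000_000, 20))  # 1024000000.0, i.e. ~1 billion
```

Ten doublings in twenty years already turn a million transistors into a billion, which is why even small deviations from the two-year cadence matter so much to chipmakers.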

However, now we have got to the point where transistors have become so small that quantum mechanics gets in the way. When you have transistors that are only a few atoms in size, you can’t pretend that they behave predictably anymore because they are in a quantum domain. So now, as companies such as TSMC, Intel and Samsung try to push this chip technology further, engineers find themselves having to do fundamental and extremely sophisticated science research.  


A note from the writer: I really hope you enjoyed this interview. This is the fourth instalment in the ‘Talking Engineering’ series. Here you can find the rest of the series and watch this space for yet more exciting discussions with engineers.

– Jasper Hersov

About the author