Source:Wired.com

In December 2012, Doug Burger, a computer-chip researcher at Microsoft, stood in a room trying to convince the company's CEO, Steve Ballmer, to consent to a project called Catapult, one that would position the company for the competitive future that was coming. It was not a smooth pitch.

It took another voice in the presentation room, that of Qi Lu, the executive who oversaw Bing, the company's search engine, to convince Ballmer. The aim of Project Catapult was to equip millions of the company's servers with specialized chips that could be reprogrammed for whatever task the company wished.

The consent was finally given, and thanks to Burger and Lu's conviction, those programmable chips, known as Field Programmable Gate Arrays (FPGAs), now underpin Bing. In the weeks to come they will drive new search algorithms built on deep neural networks, a form of artificial intelligence loosely structured on how the human brain works, allowing the AI to carry out tasks far faster than ordinary chips could.

The result: a task completes in just 23 milliseconds, rather than the 4 seconds during which nothing appears on your screen. The FPGAs will also power Azure, Microsoft's cloud computing service, and the company expects that in a few years FPGAs will run on all of its servers.

The company says the move was not intended to compete with Google, but simply to keep pace with the way forward, as many Internet giants are doing. Microsoft spends no less than $5 to $6 billion a year just to maintain the hardware needed to run its online empire.

According to Satya Nadella, who took over as Microsoft's CEO in 2014, the effort is no longer just research; it is a matter of essential priority.

The company's search engine is a single online service that runs across thousands of machines. Those machines have been driven mainly by CPUs, which Intel continues to improve, but which have been unable to keep up with the pace at which software, and AI in particular, is moving.


It would be rather taxing to try to create a special-purpose chip for every problem that may arise, but FPGAs solve that problem. They make it possible for engineers to build chips that consume less energy and work faster than CPUs, while still being able to adapt to the new and changing demands of business models and technology.

The first prototype grouped six FPGAs into a single pool shared among servers. If any task needed more than six FPGAs, there was a problem.

This is why Burger's team embarked on a second prototype. It used only one FPGA, built onto a circuit board attached to each server. But each FPGA board also linked to the boards on other servers, making it possible for any Bing machine to tap into large pools of programmable chips.
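The pooling idea behind that second prototype can be illustrated with a minimal sketch. This is not Microsoft's implementation, and all class and server names here are hypothetical; it only models the architecture described above: each server owns one accelerator, but any server can borrow idle accelerators attached to its peers, forming a shared pool.

```python
# Hypothetical sketch of the Catapult-style pooling described above:
# one accelerator (FPGA) per server, all linked into one shared pool.

class Accelerator:
    def __init__(self, host):
        self.host = host   # name of the server this board is attached to
        self.busy = False  # whether some server has claimed it

class Server:
    def __init__(self, name, pool):
        self.name = name
        self.accelerator = Accelerator(name)
        pool.append(self.accelerator)  # every board joins the shared pool
        self.pool = pool

    def acquire(self, count):
        """Claim up to `count` idle accelerators, including peers' boards."""
        claimed = [a for a in self.pool if not a.busy][:count]
        for a in claimed:
            a.busy = True
        return claimed

    def release(self, accelerators):
        for a in accelerators:
            a.busy = False

# Eight servers, each contributing its one board to the pool.
pool = []
servers = [Server(f"srv{i}", pool) for i in range(8)]

# A demanding query on srv0 can now use more boards than srv0 owns.
claimed = servers[0].acquire(4)
hosts = [a.host for a in claimed]
```

The design point this captures is the one that fixed the first prototype's limitation: because the boards are networked rather than tied to a fixed group of six, a single machine is no longer capped by its own hardware.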