By Jane Lanhee Lee
OAKLAND, California (Reuters) – Silicon Valley chip startup Ayar Labs has raised an additional $25 million following the close of its latest funding round, as demand grows for faster, more energy-efficient computing for artificial intelligence applications, the company said on Wednesday.
The funding will be used to accelerate commercialization of Ayar Labs’ silicon photonics technology, which uses light instead of electricity to transfer data between chips or servers, it added.
The funding round, which closed in April this year, brings in new investors such as Capital TEN and VentureTech Alliance, both of which have deep connections to Taiwan, Ayar Labs CEO Charles Wuischpard said in an interview.
Silicon Valley computing giant Nvidia Corp and venture capital firm Tyche Partners are among existing backers that increased their investments. GlobalFoundries Inc and Intel Corp are also strategic investors.
“I think things like ChatGPT have just accelerated this trend” of betting on photonics technology, Wuischpard said, referring to the chatbot created by OpenAI and released in November 2022 with the backing of Microsoft Corp.
“Investments in AI infrastructure are accelerating and growing like crazy with the introduction of generative AI models and redefining data center architectures.”
The company said its market valuation has increased from last year when it raised $130 million, but declined to elaborate.
Ayar Labs’ technology is nearly ready to be used in data centers to ease the bottlenecks that arise when data is transmitted electrically, Wuischpard said.
For decades, light has been used to transmit data through fiber-optic cables, including undersea cables, but the devices used to create or control light have not been as easy to shrink as transistors.
A small chip that translates data between electrons and photons is a key part of the technology, Wuischpard said. Those chips are made domestically by GlobalFoundries, based in Malta, New York, whose manufacturing has enabled much of the development in silicon photonics, he said.
(Reporting by Jane Lanhee Lee; Editing by Richard Chang and Stephen Coates)