
Mythic AI Targets the Edge AI Market with Its New Smaller, More Power-Efficient M1076 Chips 

It has been a busy stretch for AI chip startup Mythic AI. Seven months ago, the company unveiled its first chip, the M1108 Analog Matrix Processor (AMP), for AI inferencing. One month ago, it announced a new $70 million Series C investment round to bring its chips to mass production and to fund its next hardware and software products.

Today, Monday, June 7, the startup announced its second chip design, the new M1076 AMP, which has a smaller form factor and lower power requirements than the original M1108 chip, making it a better fit for smaller edge AI devices and applications. The new M1076 chip measures 19mm x 15.5mm, compared to 19mm x 19mm for the original M1108.

Packing up to 25 TOPS in a single chip, the M1076 AMP will be available in three form factors: a standalone chip, a compact 22mm x 30mm ME1076 PCIe M.2 A+E Key card, and an MM1076 PCIe M.2 M Key card measuring 22mm x 80mm. A full-sized PCIe card containing up to 16 M1076 chips, which would deliver 400 TOPS of compute and hold 1.28 billion weights while consuming 75 watts, will also be offered in the future as customers require such configurations, according to the company. The standalone M1076 chip consumes three to four watts of power, but can operate at one to two watts for certain workloads. The ME1076 and MM1076 M.2 cards each consume less than five watts for typical workloads. Both the M1076 and the original M1108 are built on a 40nm process node.
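For readers who want to sanity-check how those numbers scale, here is a minimal back-of-the-envelope sketch in Python using only the figures quoted above; the per-chip weight capacity and TOPS-per-watt values it prints are derived arithmetic, not figures published in the announcement.

```python
# Back-of-the-envelope scaling of the published M1076 figures.
# Inputs come from Mythic AI's announcement; the per-chip weight capacity
# and card-level efficiency are simply derived from those numbers.

CHIP_TOPS = 25            # peak compute per chip (TOPS), per the announcement
CARD_CHIPS = 16           # chips on the planned full-sized PCIe card
CARD_TOPS = 400           # stated aggregate compute of that card
CARD_WEIGHTS = 1.28e9     # stated aggregate weight capacity of that card
CARD_WATTS = 75           # stated power envelope of that card

assert CHIP_TOPS * CARD_CHIPS == CARD_TOPS          # 16 x 25 = 400 TOPS

weights_per_chip = CARD_WEIGHTS / CARD_CHIPS        # ~80 million weights/chip (implied)
tops_per_watt_card = CARD_TOPS / CARD_WATTS         # ~5.3 TOPS/W at card level (implied)

print(f"Implied per-chip weight capacity: {weights_per_chip / 1e6:.0f}M weights")
print(f"Implied card-level efficiency: {tops_per_watt_card:.1f} TOPS/W")
```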

Tim Vehling of Mythic AI

The new chip’s arrival only seven months after the debut of the first M1108 chips was planned, and it reflects demand from many prospective customers for a smaller, more power-efficient chip, Tim Vehling, senior vice president of product and business development at Mythic AI, told EnterpriseAI.

“We're basically showing that we can derive and generate different versions of the technology pretty easily” to serve customer needs, said Vehling. “One of the promises of the architecture is that you can quickly come up with bigger or smaller versions and that's what this shows.”

So far, Mythic AI is not shipping either chip to customers, he said, but the plan is to ship the M1076 chips by the end of 2021 or in early 2022.

“We have been sampling the original M1108 chips with customers,” said Vehling.  “We will be providing early samples of this new M1076 product next month. We will see what customers go into production with, which one. But I think this version hits the sweet spot from a size, performance and power point of view.”

The second chip design was planned early on to give customers a variety of chips from which to choose for their products, he said.

Mythic AI's new smaller M1076 AI chip

“It turns out that in a [server] world, the larger size [of the original M1108 chip] is not an issue, but if you want to have a strong presence in the embedded world, with embedded products using Arm-based processors,” components need to be smaller, said Vehling. “As we are engaging customers, we basically have to make our product much more scalable from the different form factors.”

Mythic AI is fabless, with all its chips being manufactured by UMC Japan.

Additional AI chips from Mythic AI are already in the pipeline, said Vehling.

“If you look at [our] architecture, [which] we call a tile architecture, you can see an array of these little tiles; the same exact tile is replicated across the chip,” he said. “So, we can definitely size what is needed based on the application we are seeing. One advantage of our architecture is that we use a cheaper and older 40 nanometer process technology. So, it is quite easy for us to do versions or derivatives. It is not a huge effort for us.”
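Vehling's description suggests that headline capability scales roughly with the number of replicated tiles. The sketch below is purely illustrative of that idea: the tile counts are inferred from the product names and the per-tile figure is back-solved from the published 25 TOPS number, so none of these values should be read as Mythic specifications.

```python
# Illustrative sketch of a tiled accelerator family, assuming (as Vehling
# describes) that each chip is the same tile replicated N times and that
# peak compute scales roughly linearly with tile count.
# Tile counts and per-tile throughput are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class TiledAmp:
    name: str
    tiles: int                        # assumed tile count, inferred from product name
    tops_per_tile: float = 25 / 76    # assumed, back-solved from the 25 TOPS M1076 figure

    @property
    def peak_tops(self) -> float:
        # Linear scaling assumption: peak compute = tiles x per-tile throughput.
        return self.tiles * self.tops_per_tile

# Hypothetical derivatives: shrink or grow the tile array to hit a target.
for chip in (TiledAmp("M1076-like", 76), TiledAmp("M1108-like", 108)):
    print(f"{chip.name}: {chip.tiles} tiles -> ~{chip.peak_tops:.0f} TOPS peak")
```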

Mythic AI's larger original M1108 AI chip

The M1076 AMP is built for edge endpoints as well as server deployments across a wide range of uses, including smart cities, industrial and enterprise settings, consumer devices and more. It is also suited to video analytics workloads such as object detection, classification and depth estimation for industrial machine vision, autonomous drones, surveillance cameras and network video recorders. It can also support AR/VR applications with low-latency human body pose estimation, which the company expects to drive future smart fitness, gaming and collaborative robotics devices.

The company’s ME1076 PCIe M.2 A+E Key and MM1076 PCIe M.2 M Key cards are expected to be available for evaluation by customers beginning in July 2021.

A Smaller Chip Is a Sensible Move: Analysts

Dan Olds, analyst

Offering the smaller new M1076 AMP chip is a smart decision on the part of Mythic AI, Dan Olds, chief research officer for Intersect360 Research, told EnterpriseAI. “This is their second chip design, but this one seems like it more closely meets size and usage demands from customers.”

Offering it in multiple configurations and performance levels is also a good move, he said. “For AI inference, this gives customers a wide range of form factors – they can use the standalone processor in their own sensor devices to provide edge inference capability or use the big 16-processor card in a data center setting. There is also a third option that sits in an M.2 drive slot of a motherboard, adding more options for customers. All are very power efficient, which is a requirement for anything remote at the edge.”

Like all the other AI chip companies competing to create the next big thing in AI silicon, Mythic AI is gunning for market leader Nvidia, which has made a fortune producing GPU compute accelerators, said Olds. “Nvidia’s GPU revenues have more than doubled in the last year or so, which paints a huge target on their back.”

But while there is plenty of competition, Mythic AI is in a good place, he said.

“Mythic, with their analog processors, is ahead of the game when it comes to inference processors,” said Olds. “This is a very innovative approach and not one that is likely to be copied. If they can execute and really prove out their advantages, they could be one of the big winners in the AI gold rush. One point in their favor is that they recently closed another venture round, this time for $70 million, which will be used to increase production and, from what I can gather, build a sales presence worldwide.”

Alex Norton, analyst

Another analyst, Alex Norton, principal technology analyst and data analysis manager for HPC and emerging technologies with Hyperion Research, said he is particularly excited by Mythic AI’s different approach to AI chip design.

“The key point that grabs me on their announcement and their platform is the energy consumption of the chip,” said Norton. “Accelerators and processors for AI can be very power hungry, so building one that is more power efficient is an important distinction for Mythic. Especially as they are targeting inference applications at the edge where power may be more limited than in a datacenter environment.”

Even more important, he added, is that the company is listening to its prospective customers as it plans its products.

“Mythic’s announcement highlights the fact that the AI chip startups are focusing on specific applications and use cases for their technology, working closely with end users to provide products tailored to their needs,” said Norton. “The low power, compact form fits exactly with their use cases.”

Including the $70 million Series C round it closed in May, Mythic AI has now raised $165.2 million in total.
