Tag: SRAM

d-Matrix Gets Funding to Build SRAM ‘Chiplets’ for AI Inference

Hardware startup d-Matrix says the $44 million it raised in a Series A round today will help it continue development of a novel "chiplet" architecture that uses a 6 nanometer chip embedded in SRAM memory modules for ac Read more…

Inference Engine Aimed at AI Edge Apps

Flex Logix, the embedded FPGA specialist, has shifted gears, applying its proprietary interconnect technology to launch an inference engine that boosts neural inferencing capacity at the network edge while reducing DRA Read more…