AI Inference IP. Ultra-low power, tiny, std CMOS. ~ 100K parameter RNN
Blumind's all-analog AMPL™ neural signal processor IP for edge AI applications is designed on standard advanced-node CMOS technology and can be integrated into high-volume SoC and MCU products.
Ideal for always-on voice detection, keyword spotting, speech-to-intent commands (10), time-series health or industrial sensor data, etc.
Taking input from analog or digital sensors, our IP uses a fall-through neural network architecture in which only active neurons consume power.
Our NN core is tiny, occupying ~1 mm², and about 2x that area including optional analog preprocessing and a 2-second audio buffer.
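To give a concrete sense of what fits in a ~100K-parameter RNN budget, the sketch below (Python, illustrative only) counts the parameters of a single-layer GRU keyword/intent classifier. The 40-feature input, 160-unit hidden state, and GRU cell type are assumptions chosen for the arithmetic, not Blumind's disclosed architecture; only the ~100K budget and the 10-command speech-to-intent task come from the listing above.

```python
# Illustrative sizing sketch: how a ~100K-parameter single-layer GRU could be
# budgeted for a 10-command speech-to-intent task. Feature width, hidden size,
# and cell type are assumptions for illustration, not Blumind's disclosed design.

def gru_params(input_dim: int, hidden_dim: int) -> int:
    """Approximate GRU layer parameter count: 3 gates of input and recurrent
    weights plus two bias vectors per gate (PyTorch-style convention)."""
    return 3 * (input_dim * hidden_dim + hidden_dim * hidden_dim + 2 * hidden_dim)

def dense_params(in_dim: int, out_dim: int) -> int:
    """Fully connected output layer: weights plus biases."""
    return in_dim * out_dim + out_dim

n_mels = 40       # assumed mel-filterbank features per audio frame
hidden = 160      # assumed recurrent state width
n_classes = 10    # speech-to-intent commands, per the listing

total = gru_params(n_mels, hidden) + dense_params(hidden, n_classes)
print(f"GRU layer    : {gru_params(n_mels, hidden):,} parameters")   # 96,960
print(f"Output layer : {dense_params(hidden, n_classes):,} parameters")  # 1,610
print(f"Total        : {total:,} parameters (within the ~100K budget)")
```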
AI IP
- RT-630 Hardware Root of Trust Security Processor for Cloud/AI/ML SoC FIPS-140
- RT-630-FPGA Hardware Root of Trust Security Processor for Cloud/AI/ML SoC FIPS-140
- NPU IP for Embedded AI
- RISC-V-based AI IP development for enhanced training and inference
- Tessent AI IC debug and optimization
- NPU IP family for generative and classic AI with highest power efficiency, scalable and future proof