Industry News
Aizip Teams Up with Renesas
Demonstrates First-of-Its-Kind Ultra-Efficient Small Language Models (SLMs) and AI Agents
Aizip, in close collaboration with Renesas, announced today the demonstration of ultra-efficient small language models (SLMs) and compact AI agents on Arm-based microprocessor units (MPUs) for a wide range of applications in edge markets. This advancement paves the way for efficient and effective human-AI interactions in home appliances, enterprise kiosks, and many other edge devices.
Large language models (LLMs) have revolutionized the AI landscape, endowing AI systems with logic and reasoning capabilities beyond simple sensing and perception. By leveraging LLMs and recent advancements in multi-modal representation, AI agents can now interact with their environments to utilize tools or perform tasks based on complex and often ambiguous human commands. However, these advanced AI systems typically require substantial effort to train and significant resources to deploy.
Aizip’s mission is to enable pervasive intelligence by building ultra-efficient, robust, and scalable AI models that can be deployed anywhere, anytime. Aizip pushes the boundaries of efficient AI, uncovering key insights such as data-centric efficiency and AI-design automation. Leveraging its expertise in developing efficient and robust edge models, Aizip has now created a series of ultra-efficient SLMs and AI agents, named Gizmo, ranging in size from 300 million to 2 billion parameters. These models support diverse platforms, including MPUs and application processors, for a broad range of applications.
New Publication – Machine Learning Systems with TinyML
The open-source ML Systems book serves as an educational resource that aims to make the principles and applications of ML systems accessible to a wide range of individuals. It focuses on TinyML, which aligns perfectly with the mission of the TinyML Foundation. The book includes numerous use cases and examples in the labs, all based on TinyML.
Beyond GPUs: Innatera and the quiet uprising in AI hardware
While much of the tech world remains fixated on the latest large language models (LLMs) powered by Nvidia GPUs, a quieter revolution is brewing in AI hardware. As the limitations and energy demands of traditional deep learning architectures become increasingly apparent, a new paradigm called neuromorphic computing is emerging – one that promises to slash the computational and power requirements of AI by orders of magnitude.
Workshop on TinyML for Sustainable Development
The International Center for Theoretical Physics will be holding a workshop in São Paulo, Brazil, from July 22 to July 26.
This hands-on workshop focuses on TinyML applications relevant to Latin American researchers, providing training on commercially available hardware optimized for embedded ML deployment. By making TinyML more accessible, especially in the Global South, this workshop will empower researchers to develop localized solutions that benefit their communities.
For more information, please see https://indico.ictp.it/event/10499
EmbedUR – ModelNova is revolutionizing the creation of AI applications for small devices
SILICON VALLEY, Calif., June 24, 2024 /PRNewswire/ — From the TinyML Summit in Milan, Italy, embedUR systems Inc., a leader in embedded systems and Edge AI and a tinyML Foundation strategic partner, is thrilled to announce the launch of ModelNova, a groundbreaking software hub for Edge AI solutions.
ModelNova is a model zoo for pre-trained AI models, optimized for different software frameworks and a variety of low-power hardware platforms with and without native AI acceleration. This innovative platform streamlines Edge AI product development, enabling rapid prototyping and deployment of intelligent applications on edge devices, in a fraction of the time it used to take to develop Edge AI solutions from scratch.
ModelNova addresses a significant challenge faced by developers: the complexity of selecting, creating, training, porting, and optimizing AI models for different hardware platforms, especially low-power IoT devices.
tinyML Foundation and Wevolver – 2024 Edge AI Report
Edge AI, empowered by the recent advancements in Artificial Intelligence, is driving significant shifts in today’s technology landscape. By enabling computation near the data source, Edge AI enhances responsiveness, boosts security and privacy, promotes scalability, enables distributed computing, and improves cost efficiency.
The tinyML Foundation has partnered with Wevolver to create a detailed report on the current state of Edge AI. This document covers its technical aspects, applications, challenges, and future trends. It merges practical and technical insights from industry professionals, helping readers understand and navigate the evolving Edge AI landscape.
World-wide Discord server unites academic and professional tinyML community
Academia represents the AI leadership of tomorrow: universities are not only driving AI research, applications, and a talent pipeline, but also present great opportunities for jobs, internships, and mentorships. They have always been a strong pillar of the tinyML Foundation!
As our next phase of strengthening our bridge with academia, we are joining forces and building out a Discord server for a worldwide conversation on tinyML and all things AI in resource constrained environments… jump in!
New datasets from the tinyML Foundation for resource constrained AI and ML
Thanks to the group effort of the Datasets and Benchmarking Working Group, the tinyML Foundation is announcing the development of new datasets designed for resource-constrained AI and ML applications, which will be freely available for the worldwide community to use and contribute to.
The purpose of this effort is to drive higher-quality training data, speed the development and deployment of high-quality solutions, and provide more comparable benchmarks for developers and integrators of this technology.
The first dataset to be made available will be Visual Wake Words, via GitHub shortly. It will be followed by other key datasets that the tinyML Foundation will curate and develop with our community.
Are you interested? Join the conversation in our Discord channel #datasets or email us at datasets@tinyml.org.
Revolutionizing Traffic Safety: The Global tinyML Traffic Hackathon
In a bid to address the rising concern of traffic-related fatalities and injuries, the Global tinyML Traffic Hackathon took place in partnership with the City of San José’s Vision Zero program. With pedestrian fatalities constituting a significant portion of traffic-related deaths, the hackathon aimed to leverage the power of energy-efficient machine learning (tinyML) to detect pedestrians and create innovative solutions for enhancing traffic safety.[1] This article delves into the key details of the hackathon, the technologies the engineers utilized, and its potential impact on traffic safety.
Third Annual tinyML EMEA Innovation Forum: Looking deeply at real-world ML applications
From sleep monitoring wearables to face detection models, tinyML is making a big impact on the way we can gather and apply data. As tinyML technologies and ecosystems are gaining more momentum and maturity, more and more applications are being developed and deployed in different verticals.
Third Annual tinyML EMEA Innovation Forum: Exploring tinyML Advancements and Future Directions
The 3rd Annual tinyML EMEA Innovation Forum, held in Amsterdam from June 26–28, recently wrapped up. One of the objectives of the event was to unite the tinyML EMEA community to empower and accelerate innovation and partnerships. The event drew a diverse range of participants, speakers, and sponsors, all converging to engage in insightful discussions and share ideas about the most recent developments and promising future of energy-efficient machine learning at the edge – tinyML.
TinyML is bringing deep learning models to microcontrollers
Deep learning models owe their initial success to large servers with vast amounts of memory and clusters of GPUs. The promises of deep learning gave rise to an entire industry of cloud computing services for deep neural networks. Consequently, very large neural networks running on virtually unlimited cloud resources became very popular, especially among wealthy tech companies that can foot the bill…
TinyML unlocks new possibilities for sustainable development technologies
In this article, we take a look at two tinyML projects that have the potential to make contributions towards sustainable development goals. While the first project is about revolutionising precision farming, the second one aims to create a network of low-cost sensors for mapping carbon emissions.
Klika Tech Joins tinyML Foundation
Klika Tech, an award-winning IoT and Cloud-native product and solutions development company, has joined the tinyML Foundation as a Strategic Partner to provide technical, cross-industry expertise as the organization advances Machine Learning for on-device data analytics and decision making at the edge.
The tinyML Foundation and its over…
Meet TinyML: The Latest Machine Learning Tech Having An Outsize Business Impact
As device sensors proliferate across every company’s value chain – from new product development through inspection, tracking, and delivery – tinyML is surfacing to provide actionable insights, transforming business as we know it. There are sound economic reasons for all this interest and activity. McKinsey researchers predict IoT will have a potential economic impact of US $4-11 trillion by 2025, identifying manufacturing as the largest vertical (US $1.2-3.7 trillion).
Why tinyML is such a big deal
While machine-learning (ML) development activity most visibly focuses on high-power solutions in the cloud or medium-powered solutions at the edge, there is another collection of activity aimed at implementing machine learning on severely resource-constrained systems.
Known as TinyML, it’s both a concept and an organization — and it has acquired significant momentum over the last year or two.
“TinyML deployments are powering a huge growth in ML deployment,…
Deploying Artificial Intelligence at the Edge: Key Takeaways from SEMI CTO Forum
Rapid advances in artificial intelligence (AI) have made this technology important for many industries, including finance, energy, healthcare, and microelectronics. AI is driving a multi-trillion-dollar global market, while helping to solve some tough societal problems such as tracking the current pandemic and predicting the severity of climate-driven events like hurricanes and wildfires.
Today, AI algorithms are primarily run at large data centers…
Himax Sponsors tinyML Vision Challenge to Foster tinyML Vision Development
Himax and the tinyML Foundation share the same vision: that tinyML technology can enable a new world with trillions of distributed intelligent devices that accurately identify and classify what they see or sense while running at ultra-low power, even on batteries. To accelerate growth of the emerging tinyML field, open knowledge exchange between developers and industry is of great importance. Hence, this tinyML Vision Challenge competition stimulates…
How TinyML is powering big ideas across critical industries
From cars and TVs to lightbulbs and doorbells, many of the objects in everyday life have ‘smart’ functionality because the manufacturers have built chips into them.
But what if you could also run machine learning models in something as small as a golf ball dimple? That’s the reality that’s being enabled by TinyML…
Machine learning at the edge: TinyML is getting big
Is it $61 billion and 38.4% CAGR by 2028, or $43 billion and 37.4% CAGR by 2027? It depends on which report outlining the growth of edge computing you go by, but in the end the difference is not that great.
What matters is that edge computing is booming. There is growing interest by vendors, and ample coverage, for good reason. …
Tiny ML: The Next Big Opportunity In Tech
Free white paper from ABI Research:
… TinyML aims to solve the issues of both cost and power efficiency by enabling data analytics performance on low-powered hardware with low processing power and small memory size, aided by software designed for small-sized inference workloads. It has the potential to revolutionize the future of the IoT.
Privacy and new functions will make TinyML big
By Stacey Higginbotham
Privacy and smart features that don’t depend on an app will likely drive the adoption of machine learning (ML) on constrained edge devices going forward. That was the message Zach Shelby, CEO of Edge Impulse, and I tried to convey when we sat on a virtual panel at the tinyML Summit this week. …
Machine Learning Is Giving Cancer Detection New Bionic Eyes
Machine learning analysis of images is being used to provide medical diagnoses. And portable solutions using vision at the edge are more efficient, lower cost, and more timely than clinical alternatives. Professor Mohammed Zubair’s research is leading the way in detecting oral cancer. [Don’t miss Professor Zubair’s tinyML Talks on this topic too.]
TinyML Could Democratize AI Programming for IoT
Upgrading microcontrollers with small, essentially self-contained neural networks enables organizations to deploy efficient AI capabilities for IoT without waiting for specialized AI chips.
How TinyML Makes Artificial Intelligence Ubiquitous
TinyML is the latest development from the world of deep learning and artificial intelligence. It brings the capability to run machine learning models on ubiquitous microcontrollers – the tiny electronic chips present almost everywhere.
Can artificial intelligence give elephants a winning edge?
Open-source developers and tech giants created the world’s most advanced elephant tracking collars.
“Sara Olsson, a Swedish software engineer who has a passion for the natural world, created a tinyML and IoT monitoring dashboard.”
Why tinyML is a giant opportunity right now
The world is about to get a whole lot smarter. As the new decade begins, we’re hearing predictions on everything from fully remote workforces to quantum computing. However, one emerging trend is scarcely mentioned on tech blogs – one that may be small in form but has the potential to be massive in implication. We’re talking about microcontrollers.
tinyML book written by Pete Warden and Daniel Situnayake of Google
Neural networks are getting smaller. Much smaller. The OK Google team, for example, has run machine learning models that are just 14 kilobytes in size—small enough to work on the digital signal processor in an Android phone. With this practical book, you’ll learn about TensorFlow Lite for Microcontrollers, a minuscule machine learning library that allows you to run machine learning algorithms on tiny hardware.
Stanford University Seminar
Evgeni Gousev of Qualcomm and Pete Warden of Google participated in a panel at the Stanford University seminar “Current Status of tinyML and the Enormous Opportunities Ahead.”
AI at the Very, Very Edge (EE Times)
When the TinyML group recently convened its inaugural meeting, members had to tackle a number of fundamental questions, starting with: What is TinyML? TinyML is a community of engineers focused on how best to implement machine learning (ML) in ultra-low power systems. The first of their monthly meetings was dedicated to defining the issue.
TinyML Sees Big Hopes for Small AI (EE Times)
SUNNYVALE, Calif. – A group of nearly 200 engineers and researchers gathered here to discuss forming a community to cultivate deep learning in ultra-low power systems, a field they call TinyML. In presentations and dialogs, they openly struggled to get a handle on a still immature branch of tech’s fastest-moving area in hopes of enabling a new class of systems.