CloudTalk

Tag: on-device AI

  • Mirai Raises $10M to Supercharge On-Device AI Performance

    The hum of servers filled the air, a familiar sound in the Mirai offices. It was February 19th, 2026, and the team was huddled around a table, poring over thermal tests. The air conditioning struggled to keep up, but the energy in the room was palpable.

    Earlier that day, news broke of Mirai’s $10 million seed round. A significant investment, especially considering the company’s focus on optimizing AI model inference directly on devices like smartphones and laptops. The co-founders of Reface and Prisma, known for their face-swapping and photo-editing apps, were now joining forces to push the boundaries of on-device AI.

    The core challenge, as explained by lead engineer Anya Sharma, is the computational cost. “Running complex AI models on devices is still a bit like fitting a supercomputer into your pocket,” she said, adjusting her glasses. “We’re focusing on making that process more efficient, reducing power consumption, and improving speed.”

    The funding news was met with a mix of excitement and cautious optimism in the industry. According to reports, analysts at JP Morgan highlighted the potential, forecasting a 30% increase in demand for on-device AI capabilities by 2027. This surge, they noted, is driven by the desire for enhanced privacy and reduced latency.

    Mirai’s approach involves a blend of software and hardware optimization. They’re working on algorithms that can intelligently scale AI models to fit the processing power available on various devices. This is a crucial step, as the market is still very fragmented, with different chip architectures and processing capabilities.
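
    The kind of capability-aware scaling described above can be sketched in a few lines. The tiers, thresholds, and variant table below are hypothetical illustrations of the idea, not Mirai’s actual system:

```python
# Hypothetical sketch of capability-aware model selection; the tiers,
# thresholds, and variant table are illustrative, not Mirai's actual API.

MODEL_VARIANTS = {
    # device tier -> (model size in millions of parameters, precision)
    "low":  {"params_m": 25,  "precision": "int8"},
    "mid":  {"params_m": 100, "precision": "int8"},
    "high": {"params_m": 400, "precision": "fp16"},
}

def classify_device(ram_gb: float, npu_tops: float) -> str:
    """Bucket a device by available RAM and NPU throughput (TOPS)."""
    if ram_gb >= 12 and npu_tops >= 30:
        return "high"
    if ram_gb >= 6 and npu_tops >= 10:
        return "mid"
    return "low"

def select_variant(ram_gb: float, npu_tops: float) -> dict:
    """Return the largest model variant the device can comfortably run."""
    return MODEL_VARIANTS[classify_device(ram_gb, npu_tops)]

# A flagship laptop gets the large fp16 variant; a budget phone gets int8.
print(select_variant(16, 45))  # {'params_m': 400, 'precision': 'fp16'}
print(select_variant(4, 5))    # {'params_m': 25, 'precision': 'int8'}
```

    In production, the capability probe would also account for chip architecture and thermal headroom, which is exactly why the fragmented device market makes this hard.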

    Meanwhile, the supply chain remains a critical factor. The availability of advanced chips, manufactured by companies like TSMC and potentially SMIC, directly impacts Mirai’s ability to execute its vision. Export controls and domestic procurement policies in countries like China add another layer of complexity, influencing everything from access to the latest GPUs to the overall pace of innovation.

    One of the key strategies is to improve the efficiency of model inference. This means making AI models run faster and with less energy on devices. The company is also working on a new framework that will allow developers to easily integrate AI features into their apps.
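
    One widely used lever for cheaper inference is post-training weight quantization: storing weights as 8-bit integers plus a scale factor instead of 32-bit floats. The snippet below is a minimal, generic illustration of that idea, not Mirai’s framework:

```python
# Generic sketch of post-training weight quantization, one common way to
# cut inference memory and energy; details are illustrative, not Mirai's.

def quantize_int8(weights):
    """Map float weights to int8 values plus a per-tensor scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return [v * scale for v in q]

weights = [0.42, -1.27, 0.05, 0.98]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# int8 storage is 4x smaller than float32, at the cost of a small
# round-off error bounded by half the quantization step.
err = max(abs(a - b) for a, b in zip(weights, restored))
assert err <= scale / 2
```

    Real frameworks quantize per-channel and calibrate activations as well, but the storage saving is the same: int8 weights take a quarter of the memory of float32.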

    “The goal is to provide a seamless AI experience for users,” said a company spokesperson in a brief statement. And, for once, that seemed like a realistic goal.

    Still, the road ahead is long. The team knows that. But the $10 million seed round provides a crucial runway, allowing them to push forward, one optimization at a time.

  • Mirai Secures $10M to Boost On-Device AI for Smartphones & Laptops

    The hum of the servers was almost a constant presence in the Mirai lab, a low thrum that vibrated through the floor. Engineers hunched over screens, their faces illuminated by the cool glow, running simulations. It was early February 2026, and the team was pushing to finalize the architecture for their on-device AI model inference platform.

    Earlier this year, Mirai, the brainchild of the co-founders behind Reface and Prisma, closed a $10 million seed round. The goal? To make AI models run smoother, faster, and more efficiently on your phone or laptop. No more waiting for cloud processing; the future, they hoped, was immediate.

    “We’re seeing an incredible surge in demand for on-device AI,” said Dr. Anya Sharma, lead analyst at Deepwater Research, during a recent briefing. “The market is projected to reach $50 billion by 2028. It’s a land grab, and Mirai is positioning itself to be a key player.”

    The core challenge, as any engineer will tell you, is efficiency. Mobile devices have limited processing power and battery life. Running complex AI models on these devices requires clever optimization. That’s where Mirai comes in, promising to squeeze every last drop of performance from the silicon. The initial focus is on smartphones and laptops, but the long-term vision includes everything from smart home devices to autonomous vehicles.

    The Mirai team is particularly focused on optimizing for the latest generation of mobile processors. They’re working with chip manufacturers to ensure their platform can take full advantage of new hardware features. It’s a complex dance, balancing performance gains with power consumption, a field where every milliwatt matters. The goal? To deliver experiences that are both powerful and battery-friendly.
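
    The milliwatt arithmetic is worth making concrete. A rough per-inference energy budget follows directly from model compute and accelerator efficiency; the figures below are illustrative assumptions, not measurements from any real device:

```python
# Back-of-the-envelope energy model for one inference pass; all numbers
# are illustrative assumptions, not measurements from any real device.

def energy_per_inference_mj(model_gflops: float, efficiency_tops_per_w: float) -> float:
    """Energy in millijoules = total operations / (operations per joule)."""
    ops = model_gflops * 1e9
    ops_per_joule = efficiency_tops_per_w * 1e12  # 1 TOPS/W = 1e12 ops/J
    return ops / ops_per_joule * 1e3  # joules -> millijoules

# A 2 GFLOP vision model on an NPU rated at 4 TOPS/W:
e = energy_per_inference_mj(2.0, 4.0)
print(f"{e:.2f} mJ per inference")  # 0.50 mJ
```

    At that rate, a phone battery holding roughly 50 kJ could in principle power tens of millions of such inferences, which is why shaving model compute and raising TOPS/W both matter so much.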

    The founders, veterans of the face-swapping app Reface and the photo-editing app Prisma, have deep experience in this very area: building consumer-facing AI products that are both fun to use and technically demanding. They have the track record to back up the ambition.

    The company is targeting a public launch of its platform by the end of 2026. The race is on, and the clock is ticking. The market appears hungry for this, though it is hard to separate genuine demand from the distortions of the recent supply shock.

    Still, the industry is watching closely. The success of Mirai will depend not only on its technology but also on its ability to navigate the complex landscape of chip shortages and geopolitical tensions. The supply chain remains a huge question mark.

    For now, though, the team is focused on the immediate task at hand: making AI, truly, mobile. And that, in itself, is a huge challenge.

  • Quadric: On-Device AI Chips Revolutionize Computing

    The hum of servers used to be the sound of AI. Now, it’s the quiet whir of a chip, nestled inside a device. At least, that’s the bet Quadric is making. The company, aiming to help companies and governments build programmable on-device AI chips, is riding the wave of a significant shift in the artificial intelligence landscape. The move away from cloud-based AI to on-device inference is gaining momentum, and Quadric seems well-positioned to capitalize.

    Earlier this week, during a call with investors, a Quadric spokesperson highlighted their focus on fast-changing models. This means the ability to run updated AI algorithms locally, without constantly pinging the cloud. It’s a critical advantage in fields like edge computing, robotics, and even national security, where latency and data privacy are paramount.

    The technical challenges are significant. On-device AI demands powerful, yet energy-efficient, processing. Traditional GPUs, designed for the cloud, often fall short. Quadric’s approach involves developing specialized chips. These chips are designed to handle the complex computations needed for AI models right on the device. This is a bit of a departure from the conventional wisdom of recent years.
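
    What “specialized” usually means in practice is hardware built around low-precision multiply-accumulate operations. The toy kernel below mimics that pattern in plain Python, with integer accumulation and a single floating-point rescale at the end; the layout and scale factors are illustrative, not Quadric’s design:

```python
# Toy version of the integer multiply-accumulate pattern that on-device
# AI accelerators are built around; layout and scales are illustrative.

def int8_matvec(weights, x, w_scale, x_scale):
    """Integer matrix-vector product with a float rescale at the end.

    Accumulation happens in wide integers (Python ints here, typically
    int32 on real hardware); only the final rescale touches floats.
    """
    out = []
    for row in weights:
        acc = sum(w * v for w, v in zip(row, x))  # integer MACs
        out.append(acc * w_scale * x_scale)       # rescale to real units
    return out

W = [[10, -3], [4, 7]]  # int8 weights
x = [5, 2]              # int8 activations
print(int8_matvec(W, x, w_scale=0.01, x_scale=0.1))
```

    Keeping the inner loop in integers is what lets such chips deliver far more operations per watt than a general-purpose GPU running the same model in floating point.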

    “The market is definitely moving in this direction,” said John Thompson, a senior analyst at Forrester, in a recent interview. “We’re seeing increased demand for low-latency, secure AI solutions, and on-device inference is a key enabler.” The analyst also noted a shift in procurement priorities in key markets, especially in light of export controls and domestic supply chain policies.

    Consider the details: Quadric’s roadmap includes the M100 and M300 chips, with projected releases in 2026 and 2027, respectively. The company is targeting a performance increase of up to 5x compared to existing solutions, according to internal projections. But the true test will be the real world, and how well these chips can handle the dynamic demands of AI models.

    Meanwhile, the supply chain remains a critical factor. The availability of advanced manufacturing processes, particularly those offered by TSMC, could be a bottleneck. The U.S. export rules and domestic procurement policies also play a significant role. It’s a complex equation, where innovation meets the realities of global politics and manufacturing capacity.

    Still, the shift towards on-device AI is clear. Quadric is among the companies poised to benefit. It’s a space that’s going to be interesting to watch as the year progresses.

  • Quadric: On-Device AI Chips Revolutionize Computing

    The hum of servers, usually a constant drone, seemed quieter than usual inside Quadric’s engineering lab, where the team was running thermal tests on the new M300 chip, slated for release in early 2027 according to the company’s roadmap. The goal: to enable AI processing directly on devices, bypassing the need for constant cloud connectivity.

    It’s a strategic pivot, as the industry begins to recognize the limitations of cloud-dependent AI. Quadric, founded to help companies and governments build programmable on-device AI chips, designs its silicon to run fast-changing models locally. This means quicker response times and enhanced data privacy, key selling points in an increasingly security-conscious world.

    “We’re seeing a significant shift,” said analyst Maria Chen from Forrester, during a recent industry briefing. “The demand for on-device inference is surging, and companies like Quadric are well-positioned to capitalize. We project the market to reach $15 billion by 2028.” That’s a bold number, considering the sector was still nascent just a few years ago. But the need is there: think of self-driving cars needing instant reactions, or edge devices in remote locations with limited bandwidth.

    The technical challenges are significant. Building these chips requires advanced manufacturing, and the global supply chain, still recovering from recent disruptions, adds another layer of complexity. Export controls also play a major role: Quadric, like many in the industry, has to navigate a complex web of US and international regulations, and domestic procurement policies in markets such as China could further shape its strategy.

    Earlier today, the team was reviewing the performance metrics for the M100, which is already in use. The focus now is on the M300, which promises a substantial performance leap. The engineers were huddled around monitors, analyzing the data. The atmosphere was focused, the air thick with anticipation. The M300 is expected to offer a 4x performance increase over the M100, according to internal projections.

    The shift to on-device AI is more than a technological evolution; it’s a strategic move. It gives companies and governments greater control over their data and operations. Quadric is, in a way, at the forefront of this transformation. Their success will depend on their ability to deliver on their promises, navigate the complex regulatory landscape, and, of course, stay ahead of the competition.