Pocket AI Computer Runs 100B+ Models Offline With Tiiny AI

Tiiny AI has launched the Tiiny AI Pocket Lab, a pocket-sized personal AI computer that runs large-scale artificial intelligence models locally. Growing concerns about data privacy, rising usage costs and cloud dependence now shape how people and businesses adopt AI. The Tiiny AI Pocket Lab answers those concerns by keeping AI on-device.

Tiiny AI introduced the product at CES earlier this week, where it drew attention from international media, industry analysts, and developers seeking alternatives to cloud-based AI services. Most mainstream AI platforms rely on remote data centres and usage-based pricing. The Tiiny AI Pocket Lab lets users run AI on-device without subscriptions, token fees or continuous internet connectivity.

At CES, Tiiny AI demonstrated the Pocket Lab running large language models with up to 120 billion parameters fully offline, at real-world decoding speeds of more than 20 tokens per second. The company says this level of performance makes the device practical for everyday use rather than a technical showcase or proof of concept.
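
Those figures are easier to interpret with a little arithmetic. The sketch below (plain Python) works out the quantized weight footprint and the memory bandwidth implied by 20 tokens per second; the 4-bit quantization and the 5-billion active-parameter figure are illustrative assumptions, not details taken from Tiiny AI's announcement.

    # Back-of-envelope arithmetic behind the headline numbers. All figures are
    # assumptions for illustration, not vendor data: 4-bit weight quantization
    # and, for the sparse case, roughly 5B active parameters per token.

    total_params = 120e9        # headline parameter count
    bits_per_weight = 4         # assumed quantization level
    tokens_per_second = 20      # demonstrated decode speed

    weight_gb = total_params * bits_per_weight / 8 / 1e9
    print(f"Quantized weight footprint: ~{weight_gb:.0f} GB")      # ~60 GB

    # Decoding is usually memory-bandwidth-bound: every generated token has to
    # stream the model's active weights out of memory once.
    dense_bw = weight_gb * tokens_per_second                       # dense model
    active_params = 5e9                                            # assumed MoE active set
    sparse_bw = active_params * bits_per_weight / 8 / 1e9 * tokens_per_second
    print(f"Dense model, implied bandwidth: ~{dense_bw:.0f} GB/s")
    print(f"Mixture-of-experts (assumed):   ~{sparse_bw:.0f} GB/s")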

As generative AI use grows, companies and individuals reassess the trade-offs between convenience and control. Cloud AI services offer scale, but they often require ongoing fees and send sensitive data to third-party servers. Local AI systems once felt impractical because hardware could not handle them. Recent advances in inference efficiency now make large models viable on compact devices, which has renewed interest in offline AI.

"People are starting to ask where their data goes and how much AI really costs over time," said Samar, GTM director of Tiiny AI. "We believe personal AI should feel more like owning a computer than renting intelligence by the token."

During CES, Tiiny AI positioned the Pocket Lab as a dedicated personal AI engine that works alongside existing laptops and desktops rather than replacing them. The device connects via plug-and-play and takes over large-model inference from the host machine. This approach lets even older computers access advanced AI capabilities without hardware upgrades.

Tiiny AI also introduced TiinyOS, its on-device software platform, alongside the hardware. TiinyOS includes a consumer client that lets users download and run open-source large language models and AI agents with a single click. It also provides developer tools that support building and deploying local AI workflows without relying on cloud infrastructure.
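
Tiiny AI has not documented the TiinyOS developer interface in this announcement, but many local-model runtimes expose an OpenAI-compatible HTTP endpoint on the host machine. The sketch below shows, hypothetically, what a fully offline workflow against such an endpoint could look like; the URL, port and model name are placeholders, not TiinyOS values.

    # Hypothetical local-inference request. Assumes the attached device exposes
    # an OpenAI-compatible chat endpoint on the loopback interface, a common
    # convention among local LLM runtimes; this is NOT a documented TiinyOS API.
    import json
    import urllib.request

    LOCAL_ENDPOINT = "http://localhost:8080/v1/chat/completions"   # placeholder URL
    MODEL_NAME = "open-weights-120b"                                # placeholder name

    payload = {
        "model": MODEL_NAME,
        "messages": [
            {"role": "user", "content": "Summarise this meeting transcript in five bullet points."}
        ],
        "max_tokens": 256,
    }

    request = urllib.request.Request(
        LOCAL_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    # No API key, no usage metering, and no traffic beyond the loopback
    # interface: prompt and response both stay on local hardware.
    with urllib.request.urlopen(request) as response:
        result = json.load(response)

    print(result["choices"][0]["message"]["content"])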

Tiiny AI Pocket Lab will launch on Kickstarter in February, with a super early-bird price of $1,399. The company says it set pricing to lower the barrier to local AI adoption, rather than position the device as a premium workstation. Pocket Lab includes 80GB of LPDDR5X memory. The company says the standalone market value of that configuration can exceed $900, reflecting current memory pricing and supply conditions.
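
A rough capacity check illustrates why the 80GB figure is the headline spec for this class of device: quantized weights for a 100B-plus-parameter model consume most of it, with the remainder left for the KV cache and runtime. The allowances below are illustrative assumptions, not a published Tiiny AI memory budget.

    # Illustrative memory-budget check with assumed figures (not vendor specs):
    # does a quantized 120B-parameter model plus working state fit in 80 GB?

    total_memory_gb = 80                 # Pocket Lab's stated LPDDR5X capacity
    weights_gb = 120e9 * 4 / 8 / 1e9     # 120B params at an assumed 4 bits/weight
    kv_cache_gb = 8                      # assumed KV-cache allowance at moderate context
    runtime_gb = 4                       # assumed OS, runtime and activation overhead

    used_gb = weights_gb + kv_cache_gb + runtime_gb
    print(f"Estimated footprint: {used_gb:.0f} GB of {total_memory_gb} GB "
          f"({total_memory_gb - used_gb:.0f} GB headroom)")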

Industry observers say products in this price range have historically struggled to run models of this scale locally. The Pocket Lab targets 100B-plus-parameter large language models on-device, placing it in a small group of emerging systems that aim to bring high-capacity AI inference to personal, offline hardware. Interest in local-first AI designs also reflects broader debates about data governance, enterprise security and the long-term economics of AI services.

In December 2025, Tiiny AI received confirmation from Guinness World Records that the Pocket Lab will be certified as "The smallest mini PC capable of running a 100B-parameter large language model locally."

Tiiny AI plans to keep engaging developers and early adopters as it expands access to Pocket Lab after its CES debut.
