Building native and cross-platform mobile applications for iOS and Android
Lightweight C++ inference for Google Gemma models
gemma.cpp is Google's standalone C++ inference engine built specifically for running Gemma language models without Python or CUDA dependencies. It provides optimized CPU inference using SIMD instructions via the Highway library, supports Gemma 2 and Gemma 3 models, and runs on x86 and ARM architectures. It is designed for embedded systems, edge devices, and server deployments that need minimal overhead.
On-device ML solutions for mobile and edge AI
MediaPipe is Google's open-source framework for building on-device machine learning pipelines across mobile, web, desktop, and edge platforms. It provides pre-built solutions for face detection, hand tracking, pose estimation, object detection, image classification, text classification, and on-device LLM inference. MediaPipe runs entirely locally without cloud dependencies, supporting Android, iOS, Python, and web browsers.
On-device AI inference engine for mobile and wearable applications
Cactus is a YC-backed open-source inference engine built specifically for running LLMs, vision models, and embeddings on smartphones, tablets, and wearable devices. It provides native SDKs for iOS, Android, Flutter, and React Native with optimized ARM CPU and Apple NPU execution paths. The project claims the fastest inference speeds on ARM processors and up to 10x lower RAM usage than generic runtimes, enabling privacy-first AI applications that run entirely on-device.
AI-powered vision-driven UI automation for web, Android, and iOS
Midscene.js is an open-source UI automation framework from ByteDance's Web Infra team that uses vision-based AI models to understand and interact with interfaces. It replaces fragile CSS selectors with natural language descriptions, supporting web browsers via Playwright and Puppeteer, Android via ADB, and iOS via WebDriverAgent from a unified JavaScript SDK.
MCP server for mobile device automation and testing
Mobile MCP is an open-source MCP server that enables AI agents to automate Android and iOS devices — navigating apps, tapping elements, extracting screen content, and running tests on simulators, emulators, and physical devices. It brings agentic mobile engineering to any MCP-compatible AI assistant.
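Registering the server with an MCP-compatible client is typically a small JSON entry in the client's configuration; the sketch below assumes the server is published on npm as `@mobilenext/mobile-mcp`, so check the project's README for the exact package name and your client's config location:

```json
{
  "mcpServers": {
    "mobile-mcp": {
      "command": "npx",
      "args": ["-y", "@mobilenext/mobile-mcp@latest"]
    }
  }
}
```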
Google's lightweight ML framework for mobile and embedded
TensorFlow Lite is Google's lightweight ML framework for deploying models on mobile and embedded devices. It supports quantization and runs on Android, iOS, Linux, and microcontrollers, with hardware acceleration through GPU, Hexagon DSP, and Core ML delegates. It also provides pre-trained models and conversion tools for TensorFlow and JAX models, and powers on-device ML in billions of Google app installations.
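A minimal sketch of the convert-then-run loop in Python; the toy Keras model here is illustrative, not taken from the TensorFlow Lite docs:

```python
import numpy as np
import tensorflow as tf

# Toy Keras model, converted to the TFLite flatbuffer format in memory
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4),
])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Run the converted model with the TFLite interpreter
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.ones(inp["shape"], dtype=np.float32))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]).shape)  # (1, 4)
```

On a device, the same flatbuffer would be saved to a `.tflite` file and loaded through the Android, iOS, or microcontroller runtime, optionally with a GPU or NPU delegate attached.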
Optimize and deploy AI models on Snapdragon devices
Qualcomm AI Hub is a platform for optimizing and deploying AI models on Snapdragon-powered devices with NPU acceleration. It provides pre-optimized models, profiling tools, and the SNPE SDK for compiling models to run efficiently on Qualcomm's Hexagon DSP and AI Engine. It supports hundreds of model architectures, with on-device benchmarking across real Snapdragon chipsets for mobile, IoT, and XR applications.
PyTorch on-device AI for mobile and edge devices
ExecuTorch is PyTorch's official solution for deploying AI models on mobile, embedded, and edge devices. It features a 50KB base runtime, 12+ hardware backends including Apple Core ML, Qualcomm QNN, ARM, and Vulkan, and native PyTorch export without format conversions. It powers Meta's on-device AI across Instagram, WhatsApp, Quest 3, and Ray-Ban Smart Glasses, supporting LLMs, vision, speech, and multimodal models.
Google's app development platform — backend, auth, database, hosting, and analytics in one.
Firebase is Google's comprehensive app development platform providing backend services including real-time database, authentication, cloud storage, hosting, serverless functions, analytics, and push notifications. It is used by millions of apps worldwide, offers a free Spark plan, and integrates tightly with Google Cloud Platform.