Source:
Phi Silica, small but mighty on-device SLM
Phi Silica is designed to run locally on Copilot+ PCs using Neural Processing Units (NPUs).
It delivers low latency, high efficiency, and no cloud dependency, making AI features faster and more private.
🔹 Powerful Yet Efficient
- Belongs to the class of small language models (SLMs), which typically run with 3–7 billion parameters.
- NPUs deliver up to 20x more performance and 100x more efficiency than traditional GPUs when running AI workloads.
🔹 Integrated Windows Features
- Powers new Windows 11 features such as:
  - Click to Do (Preview)
  - On-device rewrite and summarize in Word and Outlook
  - Recall (Preview) for retrieving past user activity
🔹 Developer Access & Multilingual Support
- Developers get access to the Phi Silica API starting January 2025; a minimal usage sketch follows this list.
- Supports a 4k-token context length and multiple languages, including English, Chinese, French, German, Japanese, and more.
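The sketch below illustrates how an app might prompt the Phi Silica model through the Windows App SDK. The namespace (`Microsoft.Windows.AI.Generative`), the `LanguageModel` class, and the `IsAvailable` / `MakeAvailableAsync` / `CreateAsync` / `GenerateResponseAsync` calls reflect the preview API surface and are assumptions here, not details confirmed by the source; check the current Windows App SDK documentation for the shipped names.

```cpp
// Minimal C++/WinRT sketch of calling the on-device model.
// All namespace, class, and method names below are assumptions based on
// the preview Windows App SDK API surface, not confirmed by the source.
#include <winrt/Windows.Foundation.h>
#include <winrt/Microsoft.Windows.AI.Generative.h>
#include <iostream>

using namespace winrt;
using namespace winrt::Microsoft::Windows::AI::Generative;

Windows::Foundation::IAsyncAction SummarizeAsync()
{
    // Ensure the on-device model is present (downloaded on demand).
    if (!LanguageModel::IsAvailable())
    {
        co_await LanguageModel::MakeAvailableAsync();
    }

    // Open a session against the local model; inference runs on the NPU.
    LanguageModel model = co_await LanguageModel::CreateAsync();

    // Single-turn prompt; the response is generated entirely on-device.
    auto result = co_await model.GenerateResponseAsync(
        L"Summarize: Phi Silica is a small language model built into Windows.");

    std::wcout << result.Response().c_str() << std::endl;
}

int main()
{
    init_apartment();
    SummarizeAsync().get();
}
```

Because the model ships with Windows and inference stays local, the prompt and the generated text never leave the device, which is what enables the privacy and cost benefits noted below.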
🔹 Privacy and Cost Benefits
- Eliminates the need for cloud-based subscriptions.
- Enhances privacy by keeping AI processing on-device.
🔹 Built for Windows
- Tailored specifically for Windows with turnkey integration; no extra tuning is needed.
- Based on a Cyber-EO compliant version of Phi-3.5-mini.