⚡ Quick News
- Apptronik Secures $350M to Enhance Humanoid Robotics. Texas-based Apptronik plans to scale up its bipedal robots with a significant funding boost. The Series A round drew backing from partners including Google DeepMind, with commercial operations targeted for 2026.
- Chinese Researchers Develop High-Speed Robotic Dog. A robotic dog developed in China can sprint 100 meters in under 10 seconds, a speed that could transform search-and-rescue missions and factory automation.
- Hugging Face and Physical Intelligence Release Pi0 for Task Adaptation. Hugging Face has collaborated with Physical Intelligence to offer Pi0, a model that can be fine-tuned for everyday robotic tasks. It adapts quickly with minimal data, advancing general-purpose robot control.
- Meta Considers Joining Forces with FuriosaAI. In an effort to diversify its AI hardware, Meta may partner with FuriosaAI, a South Korean chipmaker. An agreement could reduce the tech giant's dependence on NVIDIA as early as this month.

OpenAI CEO Sam Altman has announced a significant change in strategy, shelving the anticipated o3 model as a standalone release in favor of folding its capabilities into the forthcoming GPT-5. The decision is meant to streamline OpenAI's offerings and simplify how users access its models. The upcoming GPT releases are designed to unify capabilities that previously lived in separate models, with tiers for both free and paid users.
Key Highlights:
- GPT-4.5, codenamed "Orion," will be OpenAI's last model released without the chain-of-thought reasoning that arrives with GPT-5.
- GPT-5 consolidates various OpenAI technologies, geared towards a broad range of uses including voice and research.
- Tiered system ensures free users access "standard intelligence," while Plus and Pro users gain advanced features.
- Models will be released shortly, with GPT-5 expected in a few months.
If you're enjoying Nerdic Download, please forward this article to a colleague. It helps us keep this content free.

In an unexpected move, Elon Musk has offered to withdraw his nearly $100 billion bid to acquire OpenAI, on the condition that the organization continues to operate as a non-profit. The development arrives amid ongoing legal disputes in which Musk accuses OpenAI of drifting from its founding mission. OpenAI is resisting the deal, arguing that Musk's strategy endangers its operational integrity.
Key Highlights:
- Musk says he will retract his buyout offer if OpenAI remains a non-profit.
- The legal battles between Musk and OpenAI center on mission deviations and operational integrity.
- OpenAI's board has to decide on the acquisition by May 10.
- OpenAI is shifting focus towards models like GPT-4.5, indicating strategic pivots beyond scaling hardware resources.
- Musk demands full due diligence, raising concerns over potential exposure of trade secrets to his AI venture, xAI.

Anthropic, a rising star in AI technology, is set to unveil a new hybrid model that promises to reshape industry standards. By integrating deep reasoning with traditional rapid-response capabilities, the model lets users decide how much computation each query receives: a "sliding scale" trades off cost, speed, and quality to fit the task.
Key Highlights:
- The new Anthropic model can switch between extensive reasoning and fast-paced processing tasks.
- Designed to outperform OpenAI’s o3-mini, it excels in coding, particularly in managing large codebases.
- The innovative sliding-scale system offers fine-grained control over the compute allocated to each query (a rough sketch of what that could look like follows this list).
- Its coding proficiency is reported to meet expert-level developer needs.
- Slated for public release shortly, it aims to underscore Anthropic's dominance in the evolving AI field.
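The sliding-scale idea is easiest to picture as a per-request knob. Below is a minimal, hypothetical Python sketch using the Anthropic SDK's messages API; the model name and the `thinking`/`budget_tokens` parameter are assumptions about what such a control could look like, not the announced interface for the unreleased model.

```python
# Hypothetical sketch of per-query compute control for a hybrid model.
# The model ID and the `thinking` parameter are illustrative assumptions.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment


def ask(prompt: str, reasoning_budget: int) -> str:
    """Dial reasoning effort up or down per request via a token budget."""
    response = client.messages.create(
        model="claude-hybrid-preview",  # placeholder, not a real model ID
        max_tokens=1024,
        # Assumed knob: cap how many tokens the model may spend "thinking".
        thinking={"type": "enabled", "budget_tokens": reasoning_budget},
        messages=[{"role": "user", "content": prompt}],
    )
    return response.content[-1].text


# Small budget for a quick lookup, larger budget for a hard planning task.
print(ask("What does HTTP status 429 mean?", reasoning_budget=1024))
print(ask("Plan a refactor of a 5,000-line module into packages.", reasoning_budget=16000))
```

The point of the design is that both calls hit the same model; only the compute budget changes per request.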

Chinese AI start-up DeepSeek has captivated the tech industry by developing a high-impact AI system with significantly fewer resources: roughly 2,000 GPUs rather than the 16,000 typically used for comparable systems. By adopting strategies such as a mixture-of-experts architecture and novel mathematical optimizations, DeepSeek cut its computing costs to approximately $6 million, a fraction of the expenses incurred by tech giants such as Meta. DeepSeek's approach highlights a shift toward cost efficiency in AI development.
Key Highlights:
- DeepSeek employs a mixture-of-experts (MoE) architecture, which divides the model into smaller, specialized units managed by a generalist system, cutting computational demands (a minimal sketch of the routing idea follows this list).
- The company performs many calculations in 8-bit number formats rather than the standard 16-bit, significantly reducing memory and compute costs.
- Accuracy is preserved by raising precision selectively: multiplication results are accumulated in 32-bit values (see the second sketch below).
- Advanced coding techniques maximize hardware efficiency, allowing better resource utilization.
- The final model training cost was $6 million, a stark contrast to the expenses faced by companies like Meta.
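To make the mixture-of-experts idea concrete, here is a minimal, self-contained Python sketch (not DeepSeek's code; all names and sizes are illustrative). A small router scores the experts for each token, and only the top-k experts actually run, so most of the parameters sit idle on any given input.

```python
import numpy as np

rng = np.random.default_rng(0)

D_MODEL, N_EXPERTS, TOP_K = 64, 8, 2  # illustrative sizes

# Each "expert" is just a small feed-forward weight matrix here.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) * 0.02 for _ in range(N_EXPERTS)]
router_w = rng.standard_normal((D_MODEL, N_EXPERTS)) * 0.02  # gating network


def moe_layer(tokens: np.ndarray) -> np.ndarray:
    """tokens: (n_tokens, D_MODEL). Only TOP_K of N_EXPERTS run per token."""
    logits = tokens @ router_w                     # (n_tokens, N_EXPERTS)
    top = np.argsort(logits, axis=-1)[:, -TOP_K:]  # indices of the chosen experts
    out = np.zeros_like(tokens)
    for t, token in enumerate(tokens):
        chosen = top[t]
        gates = np.exp(logits[t, chosen])
        gates /= gates.sum()                       # softmax over the chosen experts only
        for gate, e in zip(gates, chosen):
            out[t] += gate * (token @ experts[e])  # run just the selected experts
    return out


x = rng.standard_normal((4, D_MODEL))
print(moe_layer(x).shape)  # (4, 64); only 2 of 8 experts executed per token
```

With 2 of 8 experts active per token, the layer touches only a fraction of its weights on each input, which is where the compute savings come from.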
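The precision trick in the second and third bullets above follows the same spirit. NumPy has no FP8 dtype, so the toy below uses int8 as a stand-in for the low-precision format: operands are stored in 8 bits, the matrix multiply accumulates in 32 bits, and the result is rescaled back to float. It illustrates the store-low, accumulate-high pattern, not DeepSeek's actual kernels.

```python
import numpy as np


def quantize_int8(x: np.ndarray):
    """Map floats to int8 with a per-tensor scale (stand-in for 8-bit storage)."""
    scale = max(float(np.abs(x).max()), 1e-8) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale


def low_precision_matmul(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Multiply 8-bit operands but accumulate the products in 32-bit integers."""
    qa, sa = quantize_int8(a)
    qb, sb = quantize_int8(b)
    acc = qa.astype(np.int32) @ qb.astype(np.int32)  # 32-bit accumulation
    return acc.astype(np.float32) * (sa * sb)        # rescale back to float


rng = np.random.default_rng(0)
a, b = rng.standard_normal((8, 64)), rng.standard_normal((64, 16))
print(np.abs(low_precision_matmul(a, b) - a @ b).max())  # error stays small despite 8-bit operands
```
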
🛠️ New AI Tools
- Refound: AI Leadership Coaching. Transform your leadership skills with AI-driven coaching grounded in insights from 100,000 leaders. This tool addresses critical aspects of personal and organizational development.
- Phind: AI Search Engine for Developers. Phind offers developers accurate answers, code examples, and guides through its AI-powered search engine. It enhances productivity by providing precise and contextual search results.
- Electe: AI for Business Data Optimization. Electe utilizes AI to help businesses unlock and maximize their data's potential. This tool facilitates data-driven decision-making in various industries.
- Benchmark Email: Intelligent Marketing Automation. Enhance your email marketing strategies with Benchmark Email's AI-driven automation, templates, and CRM tools. This solution boosts marketing efficiency and outreach.