Task-Taming Tech

Plus: AI sleep buds & signing made simple

With massive cuts in federal spending on research underway, it’s nice to see the AI world stepping up. OpenAI, the team behind ChatGPT, just launched NextGenAI, a first-of-its-kind collaboration with 15 top research institutions, including Caltech, Duke, Harvard, Howard, and MIT.

With $50 million in grants, computing power, and API access, NextGenAI aims to drive research breakthroughs and advance education. Let’s hope we see more initiatives like this in the coming months.

In this week’s edition, we’ve got an AI agent for early adopters, AI sleep science, smart tech for teen wellness, and learning sign language with AI. 

Let’s get started!

Credit: Google’s ImageFX

AI Agents

Move over, Alexa and Siri. Imagine a digital assistant that does more than answer your questions: it actively searches the web and completes tasks for you. AI agents are no longer just a concept—they're rapidly becoming a reality, transforming how we interact with technology.

Proxy is an AI agent that offers a free way to experiment with autonomous AI. Unlike OpenAI's Operator, which requires a hefty $200 monthly Pro subscription, Proxy lets early adopters try its platform at no cost.

How it works: Think of Proxy like a super-powered chatbot that doesn't just talk – it acts. Start with a simple instruction like "Track the top five tech news stories" or "Monitor LinkedIn job trends." Now, here's where things get interesting: instead of just returning a quick answer, Proxy opens a window showing you exactly how it's gathering information.

We were fascinated to watch the agent navigate different websites, scan content, and synthesize information in real time. It's like having a transparent digital detective working for you!

While today's AI agents are focused on information-gathering, the future is even more exciting. Soon, these digital assistants will:

  • Make travel reservations

  • Handle online purchases

  • Schedule appointments

  • Manage your subscriptions

Truthfully, we're not yet ready to hand over our credit cards to these digital helpers. But if you're curious (like us!) about where technology is heading, it's worth taking Proxy for a spin. Think of it as a preview of how we'll all interact with AI in the coming years. And the best part? You can try it now, for free, while it's still in its launch phase.

Credit: Tone Buds

AI for Sleep

Do you have trouble falling asleep? Soon, you’ll be able to get a bit more shut-eye with the help of AI. NextSense is launching Tone Buds, the first AI-driven, biosensing sleep earbuds designed to improve sleep quality by reading and responding to your brain’s electrical activity in real time.

Unlike traditional sleep trackers that rely on heart rate and movement data—often imprecise—Tone Buds use EEG technology to directly measure brainwaves, offering a more accurate picture of sleep health.

Sleep deprivation is a national health crisis—70 million Americans struggle with chronic sleep issues, according to the CDC. Women tend to struggle more with insomnia and report poorer sleep quality, while men are more likely to deal with sleep apnea.

Here’s how Tone Buds work:

  • Responsive sleep optimization – Tone Buds track brain activity and adjust in real time, playing sounds that help promote deep, restorative sleep.

  • Medical-grade sensors – Safe for daily wear, the earbuds meet rigorous safety standards and enhance slow-wave activity, which is linked to better brain health.

  • A tailored sleep experience – Users can drift off to their favorite audio or Tone’s curated soundscapes, designed to align with brain rhythms.

Worried about your privacy and where your data might end up? NextSense says all data is encrypted, user-controlled, and never collected without explicit consent. Users can review or delete their data at any time.

Tone Buds are available for pre-order at $299, with shipments expected in Q2. A monthly subscription covers sleeve replacements to maintain peak performance and accurate readings. With Tone Buds, better sleep isn’t just a dream—it just might be music to your ears.

Credit: Pixabay

AI & Mental Health

As schools across the US grapple with rising mental health problems among students, a startup called Sonar Mental Health has come up with a tech-driven solution that blends AI with human intervention.

At the heart of this approach is Sonny, an AI-powered well-being companion that provides real-time, personalized support and connects students to professional help when needed.

With 17% of high schools lacking a counselor, many students are left without critical support. Sonar fills this gap by offering 24/7, judgment-free assistance without increasing the burden on school resources.

Founded on personal experience, Sonar’s team—made up of clinicians, researchers, technologists, and youth leaders—aims to reshape student mental healthcare by identifying challenges early and preventing crises before they escalate.

Sonny isn’t a therapist, but it acts as a bridge to real help. Here’s how it works:

  • Students text Sonny when they need support.

  • AI suggests a response, but every message is reviewed and sent by a trained Wellbeing Companion—a human providing guidance and reassurance.

  • If more help is needed, Sonar connects students with licensed therapists, school counselors, or mental health professionals—eliminating common barriers that delay care.

Since launching its first school partnership in January 2024, Sonar has expanded to support over 4,500 students across nine districts. A team of six trained professionals—specializing in psychology, social work, and crisis response—ensures that every conversation is handled with care.

By combining AI efficiency with human empathy, Sonar offers a scalable, proactive approach to student mental health—ensuring no student has to face their challenges alone.

Credit: Signs-ai

AI for Good

Are you someone who uses sign language? Ever wanted to learn ASL?

AI chip giant NVIDIA has helped to launch Signs, a free, AI-driven platform designed to make American Sign Language (ASL) learning more accessible. Whether you're new to ASL or fluent, Signs provides real-time feedback and a growing open-source dataset to enhance learning and engagement. All it takes is your webcam and a desire to learn.

Many hearing parents of deaf children struggle to learn ASL, creating barriers to communication. Signs bridges this gap by offering interactive learning and encouraging fluent signers to contribute their knowledge, making it both a learning tool and a community-driven resource.

How it works:

  • Users enable their cameras to receive real-time AI feedback while practicing.

  • Right-handed users select the blue hand to follow; left-handed users select purple.

  • Choose the level you’re comfortable with -- from beginner to expert.

  • Fluent ASL signers can also upload videos, helping to improve accuracy.

At launch, the platform features 100 distinct signs, with plans to grow to 1,000.

ASL relies on more than hand movements—facial expressions and body language are crucial. While the initial dataset focuses on hand and finger positions, NVIDIA is exploring ways to incorporate these non-manual signals using AI and motion capture.

The data could also enable NVIDIA to develop new ASL-related products in the future. By making ASL learning interactive, community-driven, and open-source, Signs empowers more people to communicate while expanding resources for assistive technology.

AI in the News (in case you missed it)
  • Why parents are teaching their Gen Alpha kids to use AI. Read here.

  • World's largest call center using AI to 'neutralize' Indian employees' accents. Read here.

  • Inside the wild west of AI companionship. Read here.

  • Are chatbots of the dead a brilliant idea or a terrible one? Read here.

Did you receive this from a friend, colleague, or family member? Please subscribe.

Have you tried a new AI-powered product? Let us know what you think.

Know anyone who might like our newsletter? Recommend it, and help grow our community!

Here.Now.AI Editorial team: Lori and Justin