Spreadsheet Superpowers
PLUS: Smart streets & decoding dolphins
In this week's edition, we’ve got a tool to unlock your spreadsheet superpowers, Google's latest urban innovation for smarter cities, groundbreaking efforts to decode dolphin language, and a captivating exhibit showcasing the evolution of AI.
Our mission remains clear: making AI accessible to everyone—no technical expertise required. If you find our weekly insights valuable, please share the love by recommending us to your friends and colleagues with just one click. Together, we’re building a community ready to navigate the AI revolution with confidence, curiosity, and enthusiasm. Thanks for being part of the journey.
Now, let’s get started!

Credit: Pixabay
AI Productivity
Chances are, most of us have spent more time struggling with spreadsheets than we’d like to admit, whether in Excel, Google Sheets, or another program. Now imagine a spreadsheet that does the heavy lifting for you. That’s the idea behind Sourcetable, the world’s first “self-driving” spreadsheet. It’s built to tackle a common problem: while 750 million people use spreadsheets every day, 80% aren’t comfortable with basics like VLOOKUP or pivot tables.
Sourcetable transforms the typical spreadsheet experience through an AI-powered interface that understands natural-language commands. Instead of memorizing complex formulas, you can simply tell the spreadsheet, quite literally, what you want done. The AI handles everything from creating financial models to building pivot tables, cleaning data, creating visualizations, and analyzing entire workbooks.
How it works: The platform combines traditional spreadsheet functionality with cutting-edge AI, powered by Groq inference chips for real-time performance. Users can interact through typed chat or hands-free voice control. The AI understands data context without requiring you to pre-select ranges, and it works effectively with messy data.
- Accessibility focus: Empowers the 80% of spreadsheet users who lack advanced skills to perform sophisticated data analysis.
- Advanced capabilities: Can generate Python or SQL code behind the scenes for complex tasks, while maintaining the familiar spreadsheet interface (see the sketch after this list).
- Business integration: Connects with over 100 data sources and applications, including most of those you’re likely to encounter at home or in the office.
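To make the “code behind the scenes” idea concrete, here’s a minimal sketch, entirely our own illustration rather than Sourcetable’s actual output: a plain-English request like “total sales by region, highest first” might quietly become a few lines of Python using the popular pandas library.

```python
# Hypothetical illustration (not Sourcetable's actual output): the kind of
# Python an AI spreadsheet might generate for the request
# "total sales by region, highest first".
import pandas as pd

# The spreadsheet's data: one row per sale.
sales = pd.DataFrame({
    "region": ["East", "West", "East", "South", "West"],
    "amount": [1200, 950, 430, 780, 1100],
})

# The equivalent of a pivot table: group by region, sum, then sort.
totals = sales.groupby("region")["amount"].sum().sort_values(ascending=False)
print(totals)
```

The point is that you never have to see or write this code; you just see the finished table.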
As CEO Eoin McMillan puts it, Sourcetable is building "the AI spreadsheet for the next billion users," where "everybody will become an analyst" through AI-enhanced tools that augment human capabilities rather than requiring specialized technical training.
Sourcetable has a generous free plan (and is completely free for faculty and students). Those who become power users will pay $20/month.

Credit: Pixabay
AI & Transportation
Google Maps just got smarter with a suite of paid, AI-powered tools designed to help cities and businesses make better decisions about roads, traffic, and location strategy. Announced at the recent Cloud Next conference in Las Vegas, the new features aim to bring real-time intelligence to urban infrastructure and commercial planning.
As cities face growing demands on transportation networks and local governments grapple with aging infrastructure, Google’s tools offer a scalable way to assess, predict, and respond to urban challenges—without the need for manual inspections or outdated reports.
How the new tools work:
- Imagery Insights: Combines Google Street View with Vertex AI to detect and analyze infrastructure elements like road damage, stop signs, and utility poles. The tool can flag broken signs, potholes, or other hazards at scale, helping cities and companies prioritize repairs more efficiently.
- Places Insights: Helps retailers and developers assess location potential based on area-level trends. For example, it might highlight underserved neighborhoods with upscale dining but no major retailers, ideal for launching a luxury storefront.
- Roads Management Insights: Leverages historical and real-time traffic data to identify congestion points, accident-prone areas, and underperforming roadways. Cities can use the insights to update traffic models, install new signals, or plan infrastructure upgrades with precision (a toy example of the idea follows this list).
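To give a feel for the kind of analysis behind a tool like Roads Management Insights, here’s a toy sketch with made-up numbers, our own simplification and not Google’s product: average the observed speeds on each road segment and flag the slowest ones as congestion candidates.

```python
# Toy sketch (our simplification, not Google's tooling): flag likely
# congestion points by averaging observed speeds per road segment.
import pandas as pd

# Made-up traffic observations: road segment and observed speed (km/h).
observations = pd.DataFrame({
    "segment": ["Main St", "5th Ave", "Main St", "Bridge Rd", "5th Ave"],
    "speed_kmh": [12, 45, 18, 8, 50],
})

# Average speed per segment; the slowest segments stand out.
avg_speed = observations.groupby("segment")["speed_kmh"].mean().sort_values()
congested = avg_speed[avg_speed < 20]  # threshold chosen only for the demo
print(congested)  # Bridge Rd (8.0) and Main St (15.0)
```

A real system works with vastly more data, but the underlying question, “where is traffic consistently slow?”, is that simple.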
Google Maps Platform General Manager Yael Maguire emphasized that the tools are not intended to replace human workers, but rather to enhance and support existing efforts—improving speed, safety, and accuracy.
As Google integrates AI into its core services, these tools mark a broader shift toward data-driven urban planning. For traffic-heavy cities like New York, São Paulo, or Tokyo, the promise isn’t just convenience—it’s smarter infrastructure at scale.

Credit: NOAA
AI & Nature
Could AI be our Dr. Dolittle moment? For nearly 40 years, the Wild Dolphin Project (WDP) has been studying a single community of Atlantic spotted dolphins in the Bahamas, on the dolphins’ terms. The research project is the longest-running attempt at “immersive journalism” in the animal world, where every whistle and click is recorded to build a living library of underwater communication.
Dolphins have long captivated scientists with their social intelligence and signature calls—vocalizations that act like names. Now, a team from the Wild Dolphin Project, Georgia Tech, and Google is taking the next leap: using AI to decode dolphin language and possibly respond.
Google has introduced DolphinGemma, the first large language model trained specifically on dolphin vocalizations.
- Based on Google's lightweight Gemma models, it processes audio recordings from decades of underwater research.
- Instead of text, it works with dolphin clicks and whistles, learning to identify patterns and even predict what sound might come next (see the toy sketch after this list).
- The goal isn’t just to interpret dolphin communication, but to build a back-and-forth exchange using synthetic dolphin-like sounds.
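To picture what “predicting the next sound” means, here’s a toy sketch, entirely our own simplification rather than DolphinGemma’s actual design: treat each whistle or click type as a token and count which sound tends to follow which, the same next-token idea that powers language models.

```python
# Toy illustration (our simplification, not DolphinGemma's architecture):
# treat each sound type as a token and learn which tends to follow which.
from collections import Counter, defaultdict

# A made-up recording transcribed into sound tokens.
recording = ["whistle_A", "click", "whistle_B", "click", "whistle_B",
             "whistle_A", "click", "whistle_B"]

# Count how often each sound follows each other sound (a bigram model).
following = defaultdict(Counter)
for current, nxt in zip(recording, recording[1:]):
    following[current][nxt] += 1

# After a "click", which sound does the model expect next?
prediction = following["click"].most_common(1)[0][0]
print(f"After 'click', the model expects: {prediction}")  # whistle_B
```

A real model learns far subtler patterns from raw audio, but the core idea, predict what comes next, is the same.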
This summer, the WDP will begin testing DolphinGemma in the field, using it to match specific vocal patterns with observed behaviors captured on video—such as play, conflict, or curiosity. Researchers will also introduce AI-generated “dolphin words” for familiar objects like seagrass or floating toys, watching closely to see how the animals respond to these novel sounds. You can watch a short video of how DolphinGemma works here.
It’s still unclear whether dolphin whistles function like human words. Slight variations might completely change a sound’s meaning—or not. This ambiguity poses a challenge for LLMs, which rely on consistent symbolic sequences. Still, identifying even partial structures could help crack open a shared vocabulary.
With DolphinGemma, the dream of understanding—and conversing with—another intelligent species is no longer science fiction. It’s floating just below the surface.

Credit: CHM
AI Arts & Leisure
If you live in California or are planning a trip there this summer, here’s something you might consider: a visit to the Computer History Museum in Mountain View, in the heart of Silicon Valley. The museum’s latest exhibit, “Chatbots Decoded: Exploring AI,” unpacks the past, present, and future of talking machines—from centuries-old contraptions to today’s phenomenally fluent algorithms.
This exhibit isn’t just about showing off what chatbots can do—it’s a deep dive into how we got here, and what society might gain (or lose) along the way.
What you'll see:
- A humanoid robot named Ameca, who speaks multiple languages and fields your questions with unsettling calm.
- Interactive displays that walk you through chatbot evolution, from early “talking machines” to 21st-century LLMs like ChatGPT.
- Artifacts that reveal how humans have long projected personality onto machines, some charming, others slightly creepy.
The exhibit finds its intellectual backbone in Alan Turing’s famous question: Can machines think? From there, it charts AI’s progress through decades of trial, error, and surprising breakthroughs—highlighting not just what worked, but what flopped. A walk-through video can be seen here.
The exhibit is best enjoyed by high school students and adults with a curious streak, a critical mind, or a soft spot for sci-fi. Whether you’re into programming or just pondering the meaning of intelligence, “Chatbots Decoded” offers something to engage, unsettle, and inspire.
AI in the News (in case you missed it)
- OpenAI is building a social network. Read here.
- AI-boosted cameras help blind people navigate. Read here.
- Johns Hopkins students use AI to help develop better baseball bats. Watch here.
- A machine using ultrasound and AI can gauge the fattiness of tuna. Read here.
Did you receive this from a friend, colleague, or family member? Please subscribe.
Have you tried a new AI-powered product? Let us know what you think.
Know anyone who might like our newsletter? Recommend it, and help grow our community!
Here.Now.AI Editorial team: Lori and Justin