Seven Must-Read Stories (Week Ending May 24, 2013)
Another chance to catch up on the most interesting and important articles from the past week on MIT Technology Review.
- Bitcoin Hits the Big Time, to the Regret of Some Early Boosters
  The first major conference for the digital currency suggests it is gaining legitimacy, but in a manner disappointing to some early enthusiasts.
- Google and NASA Launch Quantum Computing AI Lab
  The Quantum Artificial Intelligence Lab will use the most advanced commercially available quantum computer, the D-Wave Two.
- Home Tweet Home: A House with Its Own Voice on Twitter
  A techie's San Francisco home has its own Twitter feed. Will yours be next?
- One-Time Pad Reinvented to Make Electronic Copying Impossible
  The ability to copy electronic code makes one-time pads vulnerable to hackers. Now engineers have found a way around this to create a system of cryptography that is invulnerable to electronic attack.
- What 5G Will Be: Crazy-Fast Wireless Tested in New York City
  Samsung's technology for ultrafast data speeds currently requires a truckload of equipment.
- Human Embryonic Stem Cells Cloned
  Scientists produced embryonic stem cells from the DNA of one person combined with a human donor egg.
- In a Data Deluge, Companies Seek to Fill a New Role
  A job invented in Silicon Valley is going mainstream as more industries try to gain an edge from big data.