We almost made it to the end of the work week. Thursday is everyone’s favourite day, right? So full of promise of the glorious weekend ahead.

Welcome to Mindstream, friend.

We make AI simple and deliver it straight to your inbox, EVERY DAY!

Everything you need to know about AI today in 5 minutes. You can even read it while you rally drive.

TODAY IN MINDSTREAM:

  • 💀 AI being used at funerals!

  • 📚 AI explainer

  • 🎤 Paralysed woman speaks again thanks to AI

  • 💡 AI-nspirational quote of the day

  • 📸 AI image of the day

💀 AI BEING USED AT FUNERALS

How would you like your entire life summarised at your funeral... by ChatGPT?

Apparently, in Ontario, Canada, that’s becoming the new norm.

The funeral industry has been exploring ways to integrate AI and augmented reality (AR) into its practices since OpenAI launched ChatGPT last November, according to the Ontario Funeral Service Association (OFSA).


The use of AI for obituaries is now "commonplace" for clients of O'Neil Funeral Home in London, Ont. It can be a good tool for inexperienced writers dealing with difficult emotions, said owner Joseph O'Neil.

Okay, grieving family members, I get that it might be difficult to write about your deceased relative - but surely there’s a more sincere way…

It’s the year 2100, and all the readers of Mindstream have had their entire lives summarised by AI and printed on their gravestones. Let’s hope there are no hallucinations…

Would you be happy with AI writing your funeral obituary?


📚 AI EXPLAINER

Get smarter about AI with our regular explainers of key concepts and phrases.

LLM

LLM stands for "Large Language Model": a neural network, typically a transformer, trained on vast amounts of text so it can understand and generate human language. These models can perform tasks like language translation, text generation, and question answering.
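Curious what that looks like in practice? Here’s a minimal sketch of asking a small LLM to continue a sentence, assuming the Hugging Face "transformers" library and the freely available GPT-2 model (picked purely for illustration, not a model Mindstream uses):

# Minimal text-generation sketch (assumed setup: pip install transformers torch)
from transformers import pipeline

# Load a small, freely available LLM (GPT-2) as a text-generation pipeline.
generator = pipeline("text-generation", model="gpt2")

# Ask the model to continue a prompt.
result = generator(
    "Artificial intelligence is",
    max_new_tokens=25,       # limit how much new text the model writes
    num_return_sequences=1,  # just one completion
)
print(result[0]["generated_text"])

Bigger models like the one behind ChatGPT work on the same principle, just with far more parameters and far more training data.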

🎤 PARALYSED WOMAN SPEAKS AGAIN THANKS TO AI

Artificial Intelligence has helped a paralysed woman speak again, through a digital avatar, for the first time in 18 years.

The avatar is driven by signals from a brain-computer interface (BCI): tiny electrodes implanted on the surface of the patient's brain.

These electrodes detect electrical activity from the part of the brain that governs speech and facial movements. These signals are then translated into the avatar's speech and facial expressions.

Researchers implanted a small, paper-thin sheet of 253 electrodes on the surface of the patient’s brain, over a region critical for speech.

They then trained an AI algorithm to recognise the patient’s unique brain signals.

This enabled the computer to learn 39 distinct sounds. A language model similar to ChatGPT was then used to 'translate' the brain signals into sentences, enabling her to communicate. Incredible!

Another AI science win!

💡 AI-NSPIRATIONAL QUOTE OF THE DAY

Embrace setbacks as setups for comebacks. Your resilience will transform every challenge into a stepping stone towards success.

Your daily pearl of wisdom from ChatGPT.

📸 AI IMAGE OF THE DAY

Alien planet. Created by Reddit user u/TheNeonGrid

Send in your images for a chance to be featured and promote your work:

👋 GOODBYE

Thanks for reading this far, we think you’re really cool.

Prepare for some juicy Friday Mindstream tomorrow!

Interested in partnering with us? Get in touch: [email protected]

Written by humans.

How was Mindstream today?

Vote and help us improve!

