

Vanderbilt's ChatGPT Email After Michigan Shooting

Welcome back to AI Now, the newsletter where we round up the most important tools and innovations from the world of artificial intelligence.

Samsung recently unveiled a new feature for its Bixby mobile assistant: the Custom Voice Creator. It creates an AI-generated copy of your voice that can answer calls on your behalf, and it works alongside other Bixby capabilities like Text Call, which turns a voice call into a written conversation by transcribing what the caller says.

Plus, users can now customize Bixby's wake-up phrase and ask it to play music based on their workout; all of these features work offline, too! To the five people who use Samsung: get ready for an amazing AI experience!

Now, let's get into today's issue!


🔧 Today's Top Tools


  • Find top prompts, produce better results, save on API costs, and sell your own prompts.

  • Think your prompts are worth money? Click on the image below and find out!

AI SEO Outlines

  • Generate amazing SEO content in seconds

  • Enter a keyword or topic for your blog post, and the tool will output everything you need to write an SEO-friendly article

  • The prompt we used below is "AGI"; click on the image to try it for yourself

Early Access:

  • Generate an infinite number of texture variants from a text prompt for a 3D model

Bad Cook Club

  • Generate a recipe using ingredients you already have available


📊 News

Vanderbilt Apologizes For Using ChatGPT in Email on Michigan Shooting

Vanderbilt University has apologized for a consoling email, generated by an AI chatbot, that it sent to students after the mass shooting at Michigan State University. The Peabody College of Education and Human Development administrators who signed the message sent a follow-up apology, saying it was "poor judgment" to use ChatGPT in such a time of sorrow. Laith Kayat, a Vanderbilt student from Michigan whose sibling attends Michigan State, called the use of AI "disgusting" and urged university officials to show genuine, human empathy instead. The two administrators are stepping back from their roles while the university investigates.


📊 Other News

  • ChatGPT launches boom in AI-written e-books on Amazon (link)

  • Bain & Company announces services alliance with OpenAI (link)

  • Welcome to AI warfare (podcast link)


🧠 Learn

Facebook teaches language models how to use tools – and the results are convincing!

Researchers from Facebook AI Research and the Universitat Pompeu Fabra have developed a groundbreaking language model that learns to call external APIs to improve its own predictions. This approach lets the model use a variety of tools, making it more capable at tasks such as arithmetic or working with dates and schedules. The researchers tested their 6.7bn parameter 'Toolformer' model against hard baselines, including a 66B OPT model (a GPT-3 replication) and the stock 175B GPT-3 model. Remarkably, it outperformed both baselines by a considerable margin, demonstrating its effectiveness. By showing how language models can be improved through API integration, this research opens up new possibilities for future development in artificial intelligence.

What is an API? API stands for Application Programming Interface. It is a set of routines, protocols, and tools for building software applications. An API specifies how software components should interact: it defines the inputs a component accepts and the outputs it returns, while hiding the implementation details. APIs give developers access to services such as databases or web services, so they can build powerful applications quickly without needing to know how those services work internally, and they let programs interoperate across different devices and operating systems.
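To make the "contract" idea concrete, here is a minimal sketch in Python. The function name, data, and behavior are all hypothetical, invented for illustration; the point is that the caller depends only on the documented interface, not on what sits behind it.

```python
# Hypothetical API: the docstring is the contract; the dict is a hidden
# implementation detail that could be swapped for a real database or
# web service without changing any caller.

def get_user(user_id: int) -> dict:
    """Return a profile dict with 'name' and 'role' for the given user id."""
    _db = {1: {"name": "Ada", "role": "admin"}}  # stand-in backing store
    return _db.get(user_id, {"name": "unknown", "role": "guest"})

# A client only honors the contract: pass an int, receive a dict.
profile = get_user(1)
print(profile["name"])   # -> Ada
print(get_user(99)["role"])  # -> guest (fallback defined by the contract)
```

This is exactly the property Toolformer exploits below: as long as a tool exposes a predictable call-and-response interface, the model does not need to know anything about its internals.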

What is Toolformer? “A model trained to decide which APIs to call, when to call them, what arguments to pass, and how to best incorporate the results into future token prediction.” The model is based on a pretrained 6.7b parameter ‘GPT-J’ model and, despite its small size, outperforms many much larger models, including the 66B OPT and 175B GPT-3 baselines mentioned above.

How they did it: They use a language model to build Toolformer’s dataset. Specifically, they take a dataset of plain text, augment it with API calls inserted into the text, then check whether each call (a) executed and (b) was useful; if so, they weave the call and its result back into the dataset. They use the resulting dataset to finetune the model so it learns to use APIs. “Moreover, as API calls are inserted in exactly those positions and with exactly those inputs that help M predict future tokens, finetuning… enables the language model to decide when and how to use which tool, based purely on its own feedback.”

The cleverest part of this: This approach is API agnostic – you can expose arbitrary APIs to the model using this method, so it will generalize to whatever tools you have lying around. Here, Facebook experiments with five tools: a question answering system, a Wikipedia search engine, a calculator, a calendar, and a machine translation system.
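One way to see why the approach is API agnostic: if every tool is just a named callable behind a common `Tool(argument)` call syntax, adding a new tool means registering one more entry. The sketch below is hypothetical (the registry, the calendar stub's date, and the parsing are all invented for illustration), but it mirrors the idea of dispatching any of the five tools through one uniform interface.

```python
# Hypothetical tool registry: each tool is a named callable taking a
# string argument and returning a string, so the dispatch code never
# needs to know which tools exist.

TOOLS = {
    "Calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),  # toy eval
    "Calendar": lambda _arg=None: "2023-02-22",  # stub returning a fixed date
}

def execute_call(call: str) -> str:
    """Parse a 'Tool(argument)' string and dispatch to the registered tool."""
    name, arg = call.rstrip(")").split("(", 1)
    return TOOLS[name](arg)

print(execute_call("Calculator(12 * 12)"))  # -> 144
print(execute_call("Calendar()"))           # -> 2023-02-22
```

Swapping in a Wikipedia search, a QA system, or a translation tool would just mean adding entries to `TOOLS`; the model's side of the interface stays identical.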
