Google is embarking on a new experiment to showcase AI collaborations across diverse disciplines. Announced on the company's Labs site, Lab Sessions will highlight Google's partnerships on experimental AI projects with creators, academics, entrepreneurs and more.
The first three Lab Sessions offer a glimpse of the initiative's potential. Electronic musician Dan Deacon teamed up with Google researchers to generate music and visuals from text prompts. The collaboration produced a unique performance piece for Google I/O 2023, incorporating tools like the text-to-music model MusicLM.
Rapper Lupe Fiasco worked with Google to build lyric-writing aids powered by large language models. The result is TextFX, a set of AI writing tools that help rappers and lyricists expand their creative process. Users can try crafting verses with TextFX online.
Finally, Google collaborated with students from Georgia Tech and RIT/NTID on PopSignAI, an educational mobile game that teaches the basics of American Sign Language. Using machine learning models for computer vision, the game recognizes players' ASL signs and scores their accuracy.
These first Lab Sessions illustrate Google's goal of shaping AI tools around human creativity and needs. While the projects are experimental, they offer templates for AI that augments human expression rather than replacing it. Google says more Lab Sessions are forthcoming as it continues to pursue opportunities for inclusive, collaborative AI research.
The company recognizes that creative AI requires input from non-technical experts such as artists and accessibility advocates. By developing and testing AI transparently alongside these communities, Google aims to address flaws like representation gaps and build more thoughtful applications. If the first Lab Sessions are any indication, the initiative could yield some of Google's most human-centered AI to date.