
This is no simple task, given that even the most advanced artificially intelligent systems have enough trouble copying the styles of existing artists and musicians, let alone coming up with entirely new ideas themselves. Eck admitted this during a panel discussion, saying that AI systems are "very far from long narrative arcs."

But Magenta will aim to create tools that help other researchers, as well as its own team, explore the creative potential of computers. Much in the same way that Google opened up TensorFlow, Eck said Magenta will make its tools available to the public.

Adam Roberts, a member of Eck's team, told Quartz that the Magenta group will start posting more information about the resources it is producing on June 1, adding new software to its GitHub page and publishing regular updates to a blog. The first thing it will launch is a simple program that helps researchers import music data from MIDI files into TensorFlow, allowing their systems to be trained on musical knowledge.
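The article doesn't describe the interface of Magenta's MIDI import tool, so the sketch below only illustrates the general idea of turning MIDI note data into tensors that a TensorFlow model could train on. The use of the third-party pretty_midi library and the file name example.mid are assumptions made for the example, not part of Magenta's actual tooling.

```python
# Rough sketch of a MIDI-to-TensorFlow pipeline, assuming the third-party
# pretty_midi library as a stand-in for Magenta's own import tool.
import pretty_midi
import tensorflow as tf

def midi_to_note_sequence(path):
    """Flatten a MIDI file into a list of (pitch, duration) pairs."""
    pm = pretty_midi.PrettyMIDI(path)
    notes = []
    for instrument in pm.instruments:
        if instrument.is_drum:
            continue
        for note in sorted(instrument.notes, key=lambda n: n.start):
            notes.append((note.pitch, note.end - note.start))
    return notes

# Build a tf.data.Dataset of note events that a model could train on.
sequence = midi_to_note_sequence("example.mid")  # hypothetical file name
pitches = [pitch for pitch, _ in sequence]
durations = [duration for _, duration in sequence]
dataset = tf.data.Dataset.from_tensor_slices(
    {"pitch": pitches, "duration": durations}
).batch(32)
```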
Eck said the inspiration for Magenta had come from other Google Brain projects, like Google DeepDream, where AI systems were trained on image databases to "fill in the gaps" in pictures, trying to find structures that weren't necessarily present in the images themselves. The result was the psychedelic images the system could create, in which ordinary photos were infused with skyscrapers, eyeballs, or household items.

Roberts also showed off a simple digital synthesizer program he'd been working on, in which an AI could listen to notes that he played and play back a more complete melody built from those notes.
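The article doesn't explain how Roberts' demo works under the hood. As a loose illustration of the "play a few notes, get a longer melody back" idea, the sketch below samples a continuation from a small next-pitch model; the architecture, context length, and seed pitches are all assumptions made for the example, and the model here is untrained.

```python
# Minimal sketch of melody continuation from a short run of played notes.
# This is an illustration only, not a description of Roberts' actual demo.
import numpy as np
import tensorflow as tf

VOCAB = 128  # MIDI pitch range

# A tiny next-pitch predictor. In practice it would first be trained on note
# sequences like those produced by the MIDI import step sketched above.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB, 32),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(VOCAB, activation="softmax"),
])

def continue_melody(seed_pitches, extra_notes=16):
    """Extend a short run of pitches by repeatedly sampling the next note."""
    melody = list(seed_pitches)
    for _ in range(extra_notes):
        context = np.array([melody[-8:]])            # last few notes as context
        probs = model.predict(context, verbose=0)[0].astype("float64")
        probs /= probs.sum()                         # guard against float32 rounding
        melody.append(int(np.random.choice(VOCAB, p=probs)))
    return melody

print(continue_melody([60, 62, 64, 65]))  # C, D, E, F as a hypothetical seed
```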
