Music is a universal language that can bring people together from all over the world. As emerging technologies help us communicate better, artificial intelligence is beginning to capture our hearts, minds, and even our ears.
AI is opening up a world of automation, personalization, and learning for its users, and the music and education sectors are no exception. Smart bots like Amper's AI can now compose their own albums, while other intelligent applications like SmartMusic let users experiment with composition and production. Yet before emerging musicians can strike the right chord with audiences, they need to foster their own talents.
AI is in line to become the next big learning tool. For now, AI can’t completely take over the creative process, but it’s definitely making music education and creation easier than ever before. But how will machine learning revolutionize music education and inspire continued human innovation in music?
For music students and emerging musicians, artificially intelligent education technology (AIEd) can reorchestrate music education to be more supportive and creative, all while democratizing the medium and widening the scope musicians have for creating new songs.
Computing sound to rhythm
In traditional music classrooms, teachers share their expertise by offering guidance on the likes of rhythmic patterns, cadence overlaps, and chord progressions, communicating and demonstrating with physical instruments. But AIEd may prove a helping hand for human teachers in the classroom.
In the United States, one of the first smart classrooms was created by a music professor at Penn State University. In this AI/virtual reality setting, called First Class, teaching apprentices could practice with AI students. If educators can use AI to assist pre-service music educators, they can surely use the technology to assist music students too.
Companies like Third Space Learning are already implementing platforms that offer artificially intelligent software to monitor and improve teaching. In this case, pupils interact with tutors through an online whiteboard as they answer questions.
The company, along with scientists at University College London, is analyzing around 100,000 hours of audio and written data from tutorials to identify how AI can augment lessons and foster better student knowledge and performance. Success metrics can also be gleaned from raw audio data, showing how many problems came up, how useful the session was for the student, and how the tutor rated the session.
Other education companies like Pearson say existing computer systems can already offer one-on-one tutoring and facilitate group discussions, and can even simulate complex environments or situations for learning purposes. In its report Intelligence Unleashed: An Argument for AI in Education, Pearson predicts that AI will be able to offer feedback on students' progress, knowledge, and even moods in a matter of seconds. Companies could then build musical instruments and supplemental teaching programs connected to digitized platforms that monitor, direct, and analyze practice and performance while students are active in music classes and at home.
In fact, Laurie Forcier, an author of Pearson's AI report, says that robotic tools in the form of devices or apps, which the report calls "lifelong learning companions," could ask questions, provide encouragement, offer recommendations, and connect to online resources. If a student encounters a difficulty, such as a faulty rhythm, the companion could guide the performance or even suggest new techniques.
AI from classroom to recording studio
To an extent, everyone learns from real-life experiences that they may not ordinarily encounter in a classroom. But with AI, music education can continue to become democratized in a physical or virtual classroom and through apps and tools in and out of the recording studio.
Today, music learning reverberates outside the classroom and into the music studio as musicians regularly incorporate AI tools into their own musical development. Douglas Eck and his research team at Google have implemented Project Magenta, a machine learning research program that is helping them understand how computers can create various forms of art and music. The project offers a synthesizer and a note sequence generation model that interact with human musicians. Through Google, users can even use Magenta plugins for Ableton Live, a leading digital audio workstation. These tools for musicians, powered by the open source machine learning library TensorFlow, provide insight into the ways musicians can learn, in and out of educational and professional music facilities.
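To make the idea of a note sequence generation model concrete, here is a minimal, hypothetical sketch. This is not Magenta's actual API: Magenta's generators are built on recurrent neural networks, whereas this toy uses a first-order Markov chain over MIDI pitches to "learn" from a training melody and sample a new one.

```python
import random

# Toy note-sequence generator in the spirit of Magenta's sequence
# models: learn pitch-to-pitch transitions from a training melody,
# then sample a new melody. (Illustrative only; real systems learn
# far richer structure with neural networks.)

def learn_transitions(melody):
    """Count which MIDI pitch tends to follow which."""
    table = {}
    for a, b in zip(melody, melody[1:]):
        table.setdefault(a, []).append(b)
    return table

def generate(table, start, length, seed=0):
    """Sample a melody of `length` notes from the transition table."""
    rng = random.Random(seed)
    notes = [start]
    for _ in range(length - 1):
        choices = table.get(notes[-1])
        if not choices:          # dead end: fall back to the start note
            choices = [start]
        notes.append(rng.choice(choices))
    return notes

# A C-major fragment as hypothetical "training data" (MIDI pitches)
training_melody = [60, 62, 64, 65, 64, 62, 60, 64, 67, 65, 64, 62, 60]
table = learn_transitions(training_melody)
new_melody = generate(table, start=60, length=8)
print(new_melody)
```

Even this crude model shows the basic loop a student-facing tool relies on: learn statistics from existing music, then generate material a human can react to and refine.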
A handful of startups are helping facilitate these learning experiences with apps and tools that nurture music creation. Popgun boasts the first AI that learns from human musicians, with skills that can complement and augment compositions at the whim of their creators. Weav is another startup whose songs vary in tempo, beat, energy, and mood depending on the listener. Lars Rasmussen, Weav's cofounder, says human artists continue to create its adaptive music, and he forecasts AI helping, not entirely replacing, human artists.
All the data musicians store in the cloud through these new technological instruments could provide a valuable record of a music student's progress. Indeed, AI can help us analyze melodies, beats per minute, and more. But music education is largely intangible, even subjective, with numerous ways to interpret a piece's style and character. Even Beethoven insisted that music could never be perfected: interpretations differ, and no performance can be entirely replicated or judged exactly like another.
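The beats-per-minute analysis mentioned above can be sketched in a few lines. The beat timestamps here are hypothetical; a real practice tool would first detect beat onsets from recorded audio with signal processing.

```python
# Estimating beats per minute from beat timestamps -- the kind of
# simple analysis an AI practice tool might run on a recording.

def estimate_bpm(beat_times):
    """Average the gaps between consecutive beats, then convert
    seconds-per-beat into beats-per-minute."""
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    avg_interval = sum(intervals) / len(intervals)
    return 60.0 / avg_interval

# Beats half a second apart correspond to 120 BPM
beats = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
print(round(estimate_bpm(beats)))  # -> 120
```

Numbers like these are easy for software to track over weeks of practice, which is exactly the kind of objective progress data the intangible, interpretive side of music can't supply on its own.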
For the most part, AIEd is still in its nascent stages. While it can store data and automate simple tasks, it can't answer broader theoretical or even cultural questions. Yes, AI will disrupt the music education system, but used in tandem with real teaching and mentorship, it can be a tool that brings more access and precision to the art.
Music’s most powerful tool: ears or AI?
Artificial intelligence has yet to mimic the human ear in all its nuance and emotional depth. In fact, AI is still learning how to teach and understand music, too. But the power of human creativity can be nurtured through new, democratized tech that can help everyone, even the most novice player, become a more seasoned and aware musician. In the future, music teachers may give a standing ovation to more personalized education as AI becomes a lifelong learning companion.
Enrique Cadena Marin (DJ ECM) is one of Latin America’s fastest rising EDM artists and producers.