As I always point out, I cover many large-scale enterprise use cases of IoT, AR and AI as an analyst at 451 Research, but much of what I write here is not directly related to that; it is a personal journey in the use of technology.

I have already written about the GoCube, the instrumented Rubik’s cube, and how its great feedback and sense of what was actually going on in the physical world helped me overcome my inability to complete the puzzle.

Now I am just waiting on the delivery of another Kickstarter-backed project, which I am looking forward to diving into to learn to play music on a keyboard.

First, though, there is some precedent from the early 2010s to share that I think sums up what can be done with great connected instrumentation (you could call it IoT) of musical instruments.


Rocksmith

On consoles and PCs there is an application called Rocksmith from Ubisoft that is designed to turn a real electric guitar into a learning tool, using a number of interesting approaches that reflect how we, as people, can learn with a bit of computer-based assistance. I have written about, and used, Rocksmith in numerous presentations over the past few years, and my daughter has now discovered how rewarding it is too.

Rocksmith uses a real electric guitar plugged into a USB port. The cable allows the software to know what note or notes are being played. Even for the basics of tuning a guitar this is very useful, but it is with the tunes that it really comes into its own. It presents notes scrolling towards the player, using colour coding for the string and its relative position, and the fret number to indicate which note to play. It is also possible to see what notes are coming next.

This approach is also used in the more arcade-style Guitar Hero and Rock Band games, which use a guitar-shaped controller with a set of buttons rather than strings. The clever bit is that Rocksmith is not a game trying to make you fail; it wants you to do well. It starts by presenting basic notes to the player, and if they start to hit those (remember, it knows what you actually play and when) it adds complexity until you’re playing the full tune. If you keep missing notes, it reduces the complexity again.
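Ubisoft has not published how its adjustment works, but the behaviour described above can be sketched as a simple feedback loop over a "note density" value. This is a hypothetical illustration (the class and thresholds are my own, not Rocksmith's implementation):

```python
# Hypothetical sketch of an adaptive-difficulty loop like the one described
# above: complexity rises as the player hits notes and falls as they miss.
# density 0.1 shows only a few anchor notes; 1.0 shows the full arrangement.

class AdaptiveDifficulty:
    def __init__(self, step=0.1):
        self.density = 0.1   # fraction of the full arrangement shown
        self.step = step

    def update(self, notes_hit, notes_shown):
        """Adjust note density after each phrase based on accuracy."""
        accuracy = notes_hit / notes_shown if notes_shown else 0.0
        if accuracy >= 0.9:      # nearly everything hit: add notes
            self.density = min(1.0, self.density + self.step)
        elif accuracy < 0.5:     # mostly missed: pull notes back
            self.density = max(0.1, self.density - self.step)
        return self.density

# A player who keeps hitting their notes climbs towards the full tune.
level = AdaptiveDifficulty()
for _ in range(5):
    level.update(notes_hit=9, notes_shown=10)
print(round(level.density, 1))  # 0.6
```

The key design point is that difficulty only ever moves in small steps, so the player sits just at the edge of their ability rather than being thrown in at full complexity.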

You can also isolate a part of the song, or a particular riff, to practise. Here you can slow the tune down (while it keeps the correct pitch) and have it repeat, then speed it up and/or add complexity as the right notes get hit.

This positive and ever-adjusting feedback rapidly accelerates any player’s ability to learn a tune, to some degree or other. It has modes for lead guitar, rhythm and even bass too.

Another feature it provides is a backing band to jam with, one that actually adjusts to what the player plays. Play fast and loud and the band keeps up; slow down and they slow down. The player chooses a key and a style of band with drums, keys, etc., and the band keeps the whole thing going, adjusting to the style of music being played, providing key changes and suggesting the range of notes that fit too.

There are now stand-alone guitar amplifiers appearing with this sort of jamming technology built in, such as the Spark amp (yes, of course, I have pre-ordered/backed this too).


Lumi keys

The next parcel I am waiting to arrive, though, is a connected musical keyboard called Lumi. This device is paired with an app and, as with Rocksmith, it knows what notes you are playing. It is also able to light up the keys that need to be played, something the guitars are not geared up to do.

This visual feedback, combined with the audio feedback of playing the notes, should make it easier to start to play tunes. I know some purists will say that we should struggle for the art and do it the old-fashioned way.

However, using technology in this way to match styles of learning and enhance the learning experience should be a priority for all forms of education. It doesn’t really mean fewer hours of practice to get proficient; it just means that the initial learning curve is more effective.


Conclusion

The ability of software and algorithms, even artificial intelligence and machine learning, to adjust to the needs of the human user of a system is the foundation of the next wave of industrial robotics, such as with cobots (collaborative robots), described in part in one of our freely available 451 Research reports on levels of robot autonomy.

These cobots adjust to the needs and skills of the people they are working with, just as Rocksmith and Lumi adjust to the skill levels of their users. A cobot has more to do interacting with the physical world than a guitar or keyboard, which are basically passive devices, but the concept is the same, and one to look out for.