Photo: Florian Dohmann smiles at the camera

03 August 2022

"The Kaleidofon helps people use their own individual skills."

What can AI contribute at the intersection of creativity and social engagement? This question has been addressed by the Berlin-based company Birds on Mars, a consultancy and agency that specialises in data and artificial intelligence. Together with the inclusive artist collective barner16, "the Birds" have developed the Kaleidofon, an interface for artistic and, above all, inclusive work. We talked to founder and Chief Creative Florian Dohmann about how exactly the Kaleidofon and the artificial intelligence behind it work, and about what other projects "the Birds" are working on in and outside of their day-to-day business.


Your project "Kaleidofon" is really unique! How did you come up with the idea for it? What inspired you?

As is so often the case in data and AI, the whole thing came about by accident: one of my colleagues has known staff at barner16 for some time. barner16 is an inclusive network for cultural productions from Hamburg with which we are working on the project. There, art meets social issues, and one of these areas is music. My colleague had the idea of talking to the colleagues at barner16, since AI would actually be perfect for creating inclusive momentum in the creative process: in other words, for consciously using the neutrality of an algorithm to make music together with everyone, whether with or without a so-called disability. In addition, alongside the many business projects we do, we at Birds on Mars have always been committed to showing what new perspectives there can and should be on the topic of artificial intelligence. We see ourselves as an inspiration for other organisations, the market and the scene. We want to show that AI can do much more than optimise advertising, and that it absolutely should also be used where AI meets social issues. I myself did community service in the social sector and am very influenced by it; this valuable experience sharpened my view of how important these areas are. These are the reasons why we wanted to show, among other things, that artificial intelligence can absolutely be used in the social sector as well. The Kaleidofon ultimately set out to help people with and without so-called disabilities use their own individual skills to make music together.

That sounds very exciting. What does the cooperation between the two parties look like? What does a day at work together look like, for example?

The whole thing came about under the peculiar circumstances of the pandemic. At the beginning, it mainly meant figuring out together who we wanted to develop the Kaleidofon for. We realised very quickly that we wanted to choose an initial user for the development, in this case Katharina. Her carers had been thinking for a while about how to give Katharina the chance to go on stage and perform live. In principle, the Kaleidofon should be open to lots of people at the same time, so that they can make music together; the community aspect is essential. In classic AI fashion, however, we first started with an MVP, a Minimum Viable Product, and in the process decided to devote ourselves entirely to Katharina for the time being. We then first had to show Katharina how a browser interface works. Since we worked on it together from a distance, we always put the new versions of the Kaleidofon on the server, which she could then control through the browser using a keyboard and mouse. At first, her carer Andreas helped her a lot.

Overall, I think it is fair to say that we have worked together in small steps in a very iterative way, and we still do. Andreas always serves as a communicator between Katharina and us. To give you a brief idea of what this means: Katharina has certain sounds through which she can express herself; she understands a lot and basically has her own individual spoken language with different ways of expressing herself. Our team first had to learn that, and we had to get a feeling for each other. For over a year now, this has been going on step by step, from one prototype to the next, and in the meantime there have also been some live concerts. A very special experience in our collaboration was that we went in with different working styles. We are used to having a lot of things synchronised, one video conference and coordination round after the other, Trello boards and so on, just like a lot of businesses in the tech sector. To be clear, our colleagues at barner16 also have an affinity for this and were immediately on board, but they still have a different rhythm. When they are sitting in their lab, a client can come in at any time and need support and attention. So it's a different way of working, which is very inspiring and valuable. Nevertheless, we first had to get used to it, and in the process we learned a lot and had a lot of fun.

Would you say that this kind of collaboration inspired you to change your own way of working even apart from the project? Or do you still stick to the familiar procedures with timed meetings etc.?

That's a good question, but to be honest, I don't really know. On a small scale it certainly makes a difference, if only through the project work on the Kaleidofon, but not all the Birds are involved in that either. Speaking for myself personally, it is incredibly enriching, especially mentally. Also influenced by my civil service time, I'm a big fan and advocate of everyone spending some time in the social sector; that would do us all a lot of good and would have a big impact on the economy and society. I am very happy to be shown again what incredibly valuable work is being done in some places. Since I am also an entrepreneur and therefore carry responsibility, it has certainly inspired me to continue developing our diversity within the context of a classic business enterprise. We are never finished with this; it is a continuous process that you first have to acknowledge, where things do not move on their own and where you have to be very proactive, especially in your responsibility as an entrepreneur.

Now back to the Kaleidofon: How exactly does it work and, above all, how does the artificial intelligence behind it work?

First, I would divide it into the vision, the goal and the current status. The whole thing is a process which, by the way, is documented at kaleidofon.ai and can also be seen in a piece on Arte Tracks.

Generally, we keep a record of the project's milestones over time so that people can follow what is happening and how it is developing. The vision is that the AI acts on two levels. At the interface level, it should be able to understand, absorb and respond to the individual skills and needs of the people working with the system. It should then be able to combine the various inputs and create a generative soundscape that processes every individual contribution and turns it into a complete work. That's the goal.

Where are we now? In classic terms, there is the input level, the processing level and the output level. In the beginning we were very much concerned with the output, looking at how we could use generative sound networks to generate sound that would be exciting for the people giving the input. However, we quickly realised that our user was primarily interested in being understood. So in the first step the output does not necessarily require a generative AI; at the moment it is triggered explicitly, that is, more or less via classical programming. Katharina has selected these output sounds in advance, and she can now control them and use them like samples on a synthesiser.

How does that work? This is where the input level comes into play, which we have been working on in particular for the last six months. It is a classifying AI: a machine learning classifier that was trained with sample data from Katharina, teaching it to trigger certain classes. You can think of it like playing the piano: one of her individual vocal expressions would be key 1, another would be key 2, and so on. She does this with her own spoken language. For this we needed a lot of sample data, which we fed into the AI and used to train it. Now Katharina is able to trigger certain output sounds in various situations via her own sounds. At the same time we also use her actual sounds, because Katharina wants that too, and these are then heard live on stage as part of the output. She will then be on stage together with the band "Kaleidomars".
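
To make the input level a bit more concrete, here is a minimal sketch in Python of how a classifying system of this kind could be set up. It is not the Kaleidofon's actual implementation; the folder layout, the MFCC feature summary, the classifier choice and the class-to-sample mapping are all illustrative assumptions.

```python
# Hypothetical sketch, not the Kaleidofon's actual code: classify short vocal
# recordings into trigger classes and map each class to a pre-selected sample.
# Assumes labelled WAV clips in data/<class_name>/*.wav; needs librosa, scikit-learn.
from pathlib import Path

import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def extract_features(wav_path, sr=22050, n_mfcc=20):
    """Summarise a clip as the mean and std of its MFCC coefficients."""
    y, sr = librosa.load(wav_path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Build the training set from the labelled sample data (folder name = class).
X, labels = [], []
for clip in Path("data").glob("*/*.wav"):
    X.append(extract_features(clip))
    labels.append(clip.parent.name)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(np.array(X), labels)

# Each class triggers an output sample the user has chosen beforehand.
SAMPLES = {"class_1": "samples/pad.wav", "class_2": "samples/drums.wav"}

def on_new_utterance(wav_path):
    """Classify a new vocal sound and return the path of the sample to trigger."""
    predicted = clf.predict([extract_features(wav_path)])[0]
    return SAMPLES.get(predicted)
```

In a live setting the prediction would presumably run on short windows of microphone audio rather than on files, but the underlying pattern of training a classifier on labelled sample data and mapping each predicted class to a pre-selected sound stays the same.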

Are there any more live concerts planned? What are the plans for the Kaleidofon?

We very much want to make the Kaleidofon available to others and use it collectively with more people, bringing in other individual characteristics or supposed disabilities. I would also like to be able to jam together with Katharina at some point. We don't distinguish between "normal" and "abnormal" here; it's simply about making music together, which is our stated aim.

At the moment, however, it is still a matter of developing the Kaleidofon further for Katharina, so that she has more classes available to trigger, and of working on the output sound. It should also be said that at the moment it is a project in which we are mainly investing ourselves, because we believe in it. We are also currently looking for sponsors and funding sources to finance further development and are happy to receive any information or support.
Through one of our other AI projects, the "artificially intelligent muse" with Roman Lipski, which is about using AI to understand inspiration in painting, I learned the most important thing: the point is to persevere. Don't just quickly develop a prototype and then, if it doesn't work right away or nobody pays for it, throw it away. That is not sustainable, and I don't believe in it. Of course, not everything is black and white; you also have to know when to stop and when to take action elsewhere. But especially with projects that are really just starting out, where we are still doing pioneering work, it is very important from my point of view to keep at it and, in case of doubt, to accept that something is left lying for a while. At some point you pick up the thread again, find the right moment, meet the right people at the right time, and suddenly completely different things are possible again.

Is this also something that you would generally recommend to young AI companies, that is, to keep at it and chase after it?

Absolutely! But as I said, it's not black and white either; on the other hand it's also wise to know when to stop. Especially when it comes to decisions, it often helps to sleep on it for a night, then decide, and, if in doubt, change your mind again two days later. A "flexible mindset" is crucial in my view. Don't get stuck on things: "kill your darlings" holds true here too, so be prepared to give up even those very things, those darlings. But in the field of AI and other super-innovative areas, I have found that staying the course often really pays off.

Are there any other subjects or projects you are working on at Birds on Mars that you would like to highlight?

The Kaleidofon is rooted in a space we call sol. Sol is the name of a day on Mars, which is about 40 minutes longer than a day on Earth. We all commit these 40 minutes every day to working on new approaches to AI that do not have any direct external funding for the time being, but that we ourselves believe in and that can then give rise to new things, including economic ones. We are currently working on several exciting topics in this sol. In general, we are doing a lot of work in the field of AI and music at the moment, and we have a very exciting collaboration planned, but unfortunately I am not allowed to talk about it yet. However, I can tell you this much: we are working with a music collective to use generative AI to generate new interactive soundscapes. For more information on our many ongoing sol projects, visit birdsonmars.com/sol.

Outside of our sol space, we have a number of exciting and important projects with our clients, because it is also important to us to work on things that we are proud of, and that are important and valuable, during our core time, that is, the time in which we earn our money. This is the only way to remain a sustainable, long-term enterprise.
An example: we have just gone live with a project for our client Deutsche Bahn. It involves training a model on internal and public data that makes it possible to predict when, where and to what extent parking spaces will be occupied. This helps to save resources, since otherwise each individual parking space would have had to be successively equipped with expensive, complex and high-maintenance sensor technology. We use individual sensors, so to speak, to calibrate the machine learning models, but in principle we are able to make area-wide predictions and determine the current utilisation of parking spaces. This is one component of the mobility of the future: users want to know in good time where it is full, how full it is and where there may be bottlenecks, in order to plan individual transport in the best possible way.
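
To illustrate the general idea described above, and not Deutsche Bahn's actual pipeline, here is a hedged sketch: the few sensor-calibrated facilities provide labelled occupancy data, a model is trained on context features, and predictions are then made area-wide for facilities without sensors. The file name, column names and features are assumptions.

```python
# Hypothetical sketch, not Deutsche Bahn's actual pipeline: learn occupancy
# from the few sensor-equipped facilities, then predict it area-wide.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Assumed columns: facility_id, hour, weekday, is_holiday, nearby_events,
# capacity, occupancy_ratio (0..1, known only where calibration sensors exist).
df = pd.read_csv("parking_observations.csv")

features = ["hour", "weekday", "is_holiday", "nearby_events", "capacity"]
labelled = df.dropna(subset=["occupancy_ratio"])  # rows with sensor ground truth

X_train, X_val, y_train, y_val = train_test_split(
    labelled[features], labelled["occupancy_ratio"], test_size=0.2, random_state=0
)

model = GradientBoostingRegressor(random_state=0)
model.fit(X_train, y_train)
print("validation R^2:", model.score(X_val, y_val))

# Area-wide prediction for facilities without their own sensors.
unlabelled = df[df["occupancy_ratio"].isna()]
if not unlabelled.empty:
    df.loc[unlabelled.index, "predicted_occupancy"] = model.predict(unlabelled[features])
```

The calibration sensors play the role of the labelled rows here; everything else is predicted from context.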

Another project we are working on is in the fields of sustainability and climate protection. We are very active in this area because we believe that AI can make a major contribution to climate protection and climate adaptation. We are implementing the project together with the Technologiestiftung Berlin foundation in the CityLab. The CityLab is committed to using civic innovation, that is, (open) data, technologies and hardware, to develop new solutions for the city, public authorities and, ultimately, us residents, solutions that make life in the city more attractive and easier. This also includes the topic of sustainability. Here, in cooperation with the city of Berlin, we are implementing the project QTrees, Quantified Trees (www.qtrees.ai), which aims to predict which trees in the city are most in need of water. To do this, we collect open data from the clouds down to the roots, calculate shading and use already installed sensors and their data. In other words, we are building a very unique data set and combining all the existing data to predict which tree needs water the most. At the moment, watering is literally done on the watering-can principle, and since water is becoming increasingly scarce, even here on site, QTrees is intended to make a contribution. Ideally, if it works for the city of Berlin, the model could also be used for other cities in the future.
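
To give a rough idea of the kind of ranking QTrees is aiming at, the sketch below combines recent weather, shading and sparse soil-moisture readings into a simple per-tree water-need score. This is not the project's actual model; all file names, column names, thresholds and weights are illustrative assumptions.

```python
# Hypothetical sketch, not the actual QTrees model: combine recent weather,
# shading and sparse soil-moisture data into a per-tree water-need score.
import pandas as pd

trees = pd.read_csv("trees.csv")      # tree_id, species, age, shading_index (1 = fully shaded)
weather = pd.read_csv("weather.csv")  # one row per day: date, rainfall_mm, temp_max
sensors = pd.read_csv("soil.csv")     # tree_id, soil_moisture (only some trees have sensors)

recent = weather.tail(14)                                    # last two weeks
rain_deficit = max(0.0, 30.0 - recent["rainfall_mm"].sum())  # assumed 30 mm baseline need
hot_days = int((recent["temp_max"] > 28).sum())              # days above 28 °C

trees = trees.merge(sensors, on="tree_id", how="left")
# Where no sensor exists, fall back to the median of the measured trees.
trees["soil_moisture"] = trees["soil_moisture"].fillna(trees["soil_moisture"].median())

# Higher score = more urgent: dry spell, heat, full sun, dry soil, young tree.
trees["need_score"] = (
    rain_deficit * 0.4
    + hot_days * 0.3
    + (1 - trees["shading_index"]) * 10        # less shade means more evaporation
    + (100 - trees["soil_moisture"]) * 0.2     # soil_moisture assumed on a 0..100 scale
    - trees["age"].clip(upper=40) * 0.1        # older trees usually have deeper roots
)

print(trees.sort_values("need_score", ascending=False)[["tree_id", "need_score"]].head(10))
```

A real model would of course learn such weights from data rather than hard-coding them; the point here is only how the different data sources come together per tree.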

How do you view developments in the field of artificial intelligence in Berlin in general?

One can clearly say that a lot is in motion right now, with initiatives and associations such as KI Park, the KI Bundesverband or Bitkom. In the area of engagement, and also through Berlin Partner, it is clear that a lot is happening, that it is being noticed and that attempts are being made to support it in various places. Companies are also asking for it more and more. However, I also have to say that many of the companies that really invest money are not from Berlin, but rather from North Rhine-Westphalia or southern Germany; that is my subjective perception from our business operations. Small and medium-sized enterprises are still lagging behind and do not yet have the investment volumes that are needed when it comes to AI development. Real money has to be spent for this, and that is not always easy in the current climate, among other things. Still, I sense a clear demand. However, political support is still very "typically German", with endless hurdles, and in some cases it is simply not designed for business-oriented companies. In the end, the usual suspects always win. For many smaller businesses, there are often still too many hurdles to getting start-up investment for new projects. This is something that I think needs to be addressed: young companies that are just starting out are often not in a position to benefit at all. Most of this has to be de-bureaucratised, otherwise we won't be able to manage these innovation projects. If I as a manager were really only interested in profits, a project like the Kaleidofon would not exist.

So in order to establish Berlin even more firmly as an AI location, this funding sector would need to be updated?

In the objective functions, exactly. If you understand the economy like an AI with an optimisation task, then this objective function must be programmed differently. It has to be much more about sustainability, participation and creation. Berlin is THE location for creative people. People move here from all over the world because it is the place to be, on so many levels. You have to play to this strength and combine AI and creativity for this reason as well. That requires money, courage, openness, and the will and space to just get going.