I guess I'm an anti, though I don't like the term. My main problem is with generative AI. I don't necessarily have a problem with other forms of AI, or with the underlying technology: machine learning, neural networks, etc. They are just technology.
Yet I do realize, upon reflection, especially after joining this community, that my internal bias can be irrational at times. Not every single thing labeled "AI" is bad, but the idea that something might have a little AI in it makes it feel tainted or something.
Serious concerns about AI aside, I don't think this kind of puritanical thinking is healthy at all.
What made me confront this was discovering that an audio plugin I was considering buying had been made using machine learning.
I am in no way affiliated with this company and I haven't even bought the plugin yet (because of a weird sense of apprehension), but sometime last year (for the audio engineering nerds) Arturia released an emulation of the Studer J-37, a tube-driven tape machine.
For everybody else, this is the tape machine that The Beatles used to record a lot of their music. There is already an emulation of this tape machine, but this one blows it out of the water. I downloaded the demo, and it imparts a beautiful character onto music.
The developers took a real machine from a studio in France, meticulously "mapped" the non-linearities the machine imparts on sound with a neural network, and trained their plugin to basically imitate the machine. Previous tape plugins have used physical modeling. Now, I don't really know much about how this works under the hood, but when I found that out, I kind of had an emotional reaction of just disappointment.
But on the other hand, I really have no reason to have that reaction. I fully realize that it is irrational. They didn't take anybody's data without consent. They aren't trying to replace the tape machine. That's already been done for a while lol. They were just using new techniques to "restore" something old as balls so that it is accessible to the rest of the world. (And obviously make money, but that's neither here nor there). The plugin does not generate music, or do any real heavy lifting. If you put shitty music into it, the music is still going to sound shitty. It's a subtle tape effect.
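For anyone curious, my loose understanding of this kind of "profiling" (purely an illustrative toy, nothing to do with Arturia's actual code; every name and number here is made up) is plain supervised learning on paired recordings: you feed the real machine a bunch of audio, record what comes back, then train a small network to reproduce that input-to-output mapping. Something roughly like this, in PyTorch:

```python
# Toy sketch of black-box neural modeling of a piece of analog hardware.
# Illustrative only; the data, model size, and training setup are invented.
import torch
import torch.nn as nn

# Hypothetical paired data: the dry test signal fed into the machine, and the
# processed signal recorded at its output. Here both are just faked.
dry = torch.randn(48000)            # stand-in for one second of test audio at 48 kHz
processed = torch.tanh(1.5 * dry)   # stand-in for the machine's saturated output

class TinyTapeModel(nn.Module):
    """A tiny causal 1-D conv net: predicts each output sample from a short
    window of past input samples. Real models are much bigger (WaveNet/LSTM style)."""
    def __init__(self, context=64, hidden=32):
        super().__init__()
        self.context = context
        self.net = nn.Sequential(
            nn.Conv1d(1, hidden, kernel_size=context),
            nn.Tanh(),
            nn.Conv1d(hidden, 1, kernel_size=1),
        )

    def forward(self, x):                                  # x: (batch, 1, time)
        x = nn.functional.pad(x, (self.context - 1, 0))    # causal left-padding
        return self.net(x)

model = TinyTapeModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = dry.view(1, 1, -1), processed.view(1, 1, -1)

for step in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)  # make the net's output match the hardware's
    loss.backward()
    opt.step()
```

Once trained, the network is just a fixed signal processor: audio in, "taped" audio out. Nothing is generated from scratch, which is part of why this sits differently with me than gen AI.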
Pros, feel free to chime in. But antis, what do you think of this kind of specific application of non-generative AI technology, like machine learning and neural networks, in creative and artistic tools?
While I have serious problems with certain uses and applications of AI, especially generative AI, I think it's important to confront our internal biases, figure out where they are coming from, and ask ourselves whether they are actually useful.
If I buy this plugin, am I sacrificing a little bit of my creativity and tainting my music, or am I allowing myself to apply a piece of musical history to my own art in a new and exciting way? I don't know. Am I just way overthinking this? Help me out here. lol
To make you feel better, here is a list of applications of AI:
https://en.wikipedia.org/wiki/Applications_of_artificial_intelligence
You have probably been using AI in one way or another for your entire life, often without knowing it. And right now every company slaps the AI label on everything because it's the current hype.
So yes, you are overthinking.
Thanks. That's a good list. I don't have a problem with most applications of AI, especially if they are benefitting everybody.
Most things don't benefit everyone. Take, for example, AI-driven translation, which drove the final nail into the coffin of a once respected and well-compensated profession. Society as a whole benefits tremendously from this advancement, but at what expense? Look at other applications of AI: there is almost always someone who suffers. How do you decide which applications are beneficial enough and which ones are not? Based on whether they affect you and yours?
I'd say satisfying literally everybody is not possible. Everything that makes something easier or automates steps can mean that you need less manpower somewhere in the process, or that you need a different kind of knowledge, or that other products aren't needed anymore. Some things happen for the greater good, but individuals or even whole industries can still suffer.
A non-AI example would be electric cars. Electric cars are a net positive for society as a whole, at least compared to cars with combustion engines. But here in Germany, that means that a whole industry is facing problems. The car manufacturers themselves, because they simply move too slowly, but also many companies that create individual parts needed for combustion engine cars.
Saying "everybody" was my mistake. "Net positive" is a much better way to say it.
This sounds like an amazing tool, and just because they slap the AI label on it doesn't mean it's the "bad" AI that many are freaking out over. I really don't see a single thing here that could be considered unethical. This is definitely the knee-jerk bias you're talking about, as this is basically just old technology remade.
The funny thing is, the company doesn't even advertise it as "AI", but specifically as machine learning.
This is a really good use case for NN-based models. I'm convinced the hardline anti position has more people strawmanning against it than actual humans who think that way, though.
I think so. It's actually kind of inspiring how something so specific and physical can be "cloned" in a sense, and then I can just download it.
In a very real but indirect way, the actual physical tape from an old machine in France is reaching through time and cyberspace to impart its quality onto people's music, in a way that was not possible before. Less of an imitation, and more of an imprint.
You're not overthinking it, you're just thinking, which is always good.
Sure, it means you're not getting a pure sound from your music, but how critical is that to the integrity of what you produce? If you use authentic or rare instruments, played live for their specific sound value, then it would undermine the point to filter them like you suggest. Otherwise it's more of a grey area. Upload anything almost anywhere publicly and it cannot escape an AI taint to some degree, whether it's compression or moderation tools or any other background processes that interact with your content in any way.
That's the specifics, but you need to zoom out for a broader view. The big 7 are financial megaliths; they hold roughly 50% of the value of the entire US stock market. There is no way to engage in the modern world and entirely untangle yourself from investment at that kind of scale. Every time you spend money in the economy, even completely off the books, you are almost inevitably, somewhere down the line, contributing to the firms that create AI or invest heavily to support it.
You can't buy anything without getting blood on your hands, whether it's orangutans losing habitat, fish poisoned by plastics or corals dying in acidified oceans, Amazon deforestation for beef or soya, or all the strategic resources and heavy/rare metals in your devices and in the GPUs in all the datacentres, mined in conditions appalling for workers and the environment, funding despotic regimes that impoverish the majority and persecute minorities... you just can't escape it.
AI scanned your post on here, and will scrape it repeatedly for training data many times over.
There is no escaping your own small share of culpability or hypocrisy; it's baked into the system. You either let it crush your opinions or hold to your principles while acknowledging that we're not perfect creatures, and are made much worse and more harmful by the systems that control us. You can't remove the speck from your eye, you can't expect others to like it, and you can still point out the plank in others'. Globalisation dictates these terms. Any other approach is sheeplike acquiescence, or absolute, total withdrawal, which amounts to falling silent just the same.
'I use AI whether I like it or not, and I wish it didn't exist' is a growing reality for a lot of people.
AI winter is coming. Again. Don't sweat it in the meantime; you never asked for it to be like this.
It wouldn't be any more pure without it, or less pure with it. It doesn't drastically change the sound any more than any other plugin does, AI or not. And it's not actively using AI in real time; an "AI" methodology was just used in its development.
Something that I've said before, particularly in music-related subs, regarding AI tools, is:
If it is not meaningfully changing your workflow, or doing heavy lifting, then it's not something you need to worry about.
But I do get something from this tool that is unavailable elsewhere, except perhaps from a real Studer J-37, which is inaccessible to me.
And that's where I'm questioning my internal conflict. Am I apprehensive because it gives me something, or makes something "easier", that is otherwise essentially impossible? Is it guilt that I'm feeling? Imposter syndrome? Or is it just the "AI" that I don't like? I feel like it might be more of the former.
For all intents and purposes, the integrity of my music would remain intact with or without it, because the difference is so subtle as to not be noticeable to most people.
Weirdly enough, I've been doing this album on real tape, and it changes my workflow 10x more than any plugin would. Everything has to be perfect before you start, basically.
I believe it!
I see. There's no intrinsic virtue in hard work or abstinence, but there's no virtue without them. This can create a lasting sense of guilt when taking the easy route, but if you've already walked the hard miles, then are you just being a martyr to a set of basic principles you've now outgrown?
If the ease of using a given tool is disproportionate to the difficulty of the rest of the creation, then it may create a sense of wasted time spent on the rest, and spoil it all for you. Ultimately, if it feels wrong, and that feeling doesn't pass, don't use it. That feeling itself will only poison your approach and the enjoyment and satisfaction that creation brings you. If you feel guilt, and part of that is fear of being discovered, then don't use it.
In the end, you can either embrace it as a tool of ease, like any other, or you can conscientiously object to it, and either know you're making things harder than they might need to be and come to peace with that, or make a virtue of it openly.
This is all very much an internal decision for you, but like I said before, there is no true purity to be found, no way to escape all forms or derivatives of AI right now, so I wouldn't sweat fringe cases like this too much.
Just be prepared to front up about it, and whatever you do, don't be evasive or too proud about using it. Think about how you'd explain yourself if you were dragged over the coals for it, and if your explanation sounds weak and unsatisfactory, then... don't use it.
Do you personally think that the same ethical concerns that apply to generative AI also apply to machine learning itself, as a methodology for creating something?
I'm fine with non-generative AI. AI is such a broad term that you can shove a lot of things under it that anyone would be fine with (for example, video game enemies' movement and attacks). I think most antis agree with me when I say that we aren't "anti AI", we're anti gen AI.
AI has been used in art for a long time. Hell, video games can have it as a core feature (an example being enemy movement), then there are Vocaloids, and a plethora of other things that might come under "AI" but that everyone is fine with.