Divinity video game from Larian - may use AI trained on own assets and to help development

I applaud you for your ironclad adherence to the idea that there should never be a disruptive technological advance of any type, because the only metric you use is the number of people immediately displaced, and anything non-zero is unacceptable.

I hope you put your money where your mouth is: that you aren't wearing any garments woven on a loom and aren't living in a house made with mass-produced materials. I hope you raise or barter for all your own food, and do so without the disruptions to agriculture and animal husbandry that allowed one family to do work that previously took many.

This isn't a strawman; history is a cascade of new ways of doing things that freed people up. We started as hunter-gatherers who spent every waking hour trying to find enough calories to survive, with an average lifespan much shorter than now. Then came agriculture, animal power, bronze, iron, steel.

Do I feel for those displaced? Without a doubt. But I don't ignore all of those that were helped by disruptive technologies either.

You live in society, curious.
 


Thank you for the definition. I have a huge issue with the ethical sourcing of data used to train models, and none of the general LLM or generative art models are up to snuff for me. That, however, does not mean those sources are inherent to the tool. There is nothing inherent about generative AI that requires the training data to be stolen.

In this article Larian says that if they use generative AI for art, it will be trained only on data that they own. There's no "largescale theft" going on, and it does not disrupt artists without compensation.

There is a robotics researcher who trained an AI on videos of people they hired to come in and fold laundry, a task with massive variation in identifying types of clothes, sizes, etc., because they were working on a general-purpose humanoid helper concept. I think it would be hard to show "largescale theft and waste, and it disrupts without compensating those disrupted" there, but you need to be able to if you want to claim that it's inherent.

You mention waste: do you know that spending an hour watching YouTube or playing a video game has a much larger environmental impact than spending the same hour chatting with an LLM?


The same thing can be said of search engines, but they've been a valuable tool for decades.

You're trying to judge it against an expert creating something. That's far from the only use of generative AI as a tool. I talked in a recent post about a friend who uses it to prototype code. He's a skilled professional, uses it as if he were pair programming, and is able to discard bad paths and identify good ones to explore much more quickly than he could without it. He's the person creating things; the tool is helping him do it.
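To be concrete about what that prototyping loop looks like: the model drafts a rough first pass, and the human judges whether the path is worth keeping. I don't actually know which tool he uses; this sketch assumes the OpenAI Python SDK, and the model name and prompts are placeholders.

```python
# Sketch of LLM-assisted prototyping: the model drafts, the human judges.
# Assumes the OpenAI Python SDK; model name and prompts are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_prototype(task_description: str) -> str:
    # Ask for a rough first pass; the human reviews it and decides.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system", "content": "Write a short, self-contained Python prototype."},
            {"role": "user", "content": task_description},
        ],
    )
    return response.choices[0].message.content

draft = draft_prototype("Parse a CSV of sales data and print monthly totals.")
print(draft)  # keep it, rework it, or throw it away and try another approach
```

The point is that the throwing-away part is cheap, which is exactly why it works for exploring paths.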


Sure. Again, that's not the tool, that's the use that someone put the tool to. It isn't inherent.

Again, I agree with you that the current crop of LLMs and generative art models have horrible issues with unethical sourcing of training material. And while I talk of friends who use it professionally, I don't, because of that. But that's down to the choices of the companies that trained the models, not anything inherent to generative AI as a tool.
I can't even find a coherent argument here. I am done.

You have a good one.
 

One side: "It's bad, period. End of discussion."
Other side: "It's not all bad."
 

So it looks like they will be using AI to quickly iterate to try things out and refine ideas, and if they use generative AI for art it will be trained only on assets they own.

I know there's been a lot of talk about what the studio that made Baldur's Gate 3 will do next. What do you think of this?
I think this is good news, and I think it's a good start. I hope others follow suit, and that it starts bending the trajectory of generative AI use toward assets that are owned/legally acquired, and away from assets that were stolen.
 

A more topical concern I have with using genAI "good enough" placeholders is that it might turn art direction into the sort of issue we have with temp music in movies. That is, the management/director gets so used to the placeholder art that, instead of giving the art team the freedom to get creative and make stuff that best fits the final themes, they wind up having to work off the placeholder art (and the direction might walk them back towards it, even unconsciously). And then of course you start having people (maybe not Larian, but others!) asking "wait, why do we have such a large art team..."

This tendency was illustrated to great effect in this classic Every Frame a Painting video about temp music, well before AI was a thing:

 

This is often most obvious in AI-generated code or technical writing - the coding task is completed quickly, but the code is fragile or difficult to maintain and breaks later, costing more in fixing regression defects than it originally saved.
Yeah, I saw this at work recently - a colleague showed me a GenAI-driven workflow that basically went to a website, looked stuff up, and reported back.

Three problems though:

1) It was incredibly slow. A human could do the same task in literally 30 seconds. Literally. This GenAI-driven bot took 10+ minutes. Sure 10 unattended minutes, but god knows how much power it must be eating up sitting there considering every web page for minutes.

2) Any time the website changed at all, it got confused, and the website changed a bit every few weeks.

3) It gave back incorrect results 100% of the time. This is the real killer. He was so impressed with the bot and I was like, but is that right? That doesn't look right. And he agreed, no, it isn't bringing back the correct info, but if you, a human, spend some time sorting through the info it's brought back, you'll find it's mixed in there. And how long does that take? About 30 seconds... bloody hell.

So we've built this elaborate bot to bring us back a bunch of junk info really slowly so we can then look through the junk to find the real info, which would take about as long as just looking up the real info...
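For comparison, the "30 seconds by hand" version is basically a few lines of scripting rather than a GenAI agent. This is only a sketch: the URL and CSS selector are made up, not the real site.

```python
# Minimal sketch of the "just look it up directly" baseline.
# The URL and selector below are placeholders, not the real site.
import requests
from bs4 import BeautifulSoup

def lookup_value(item_id: str) -> str:
    # Fetch the same page the human would open in a browser.
    resp = requests.get(f"https://example.com/items/{item_id}", timeout=10)
    resp.raise_for_status()

    # Pull out the one field we actually need.
    soup = BeautifulSoup(resp.text, "html.parser")
    cell = soup.select_one("td.item-status")  # placeholder selector
    if cell is None:
        raise ValueError("Page layout changed; the selector needs updating")
    return cell.get_text(strip=True)

print(lookup_value("12345"))
```

It breaks when the page layout changes too, but at least it fails loudly in seconds instead of confidently returning junk after ten minutes.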
 

Since you mentioned power usage for all that wasted time:

AI could account for nearly half of datacentre power usage ‘by end of year’



And then this



" This isn’t simply the norm of a digital world. It’s unique to AI, and a marked departure from Big Tech’s electricity appetite in the recent past. From 2005 to 2017, the amount of electricity going to data centers remained quite flat thanks to increases in efficiency, despite the construction of armies of new data centers to serve the rise of cloud-based online services, from Facebook to Netflix. In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023. The latest reports show that 4.4% of all the energy in the US now goes toward data centers"


So... way more power than looking up the information directly would have taken, so much so that GenAI services are causing a massive increase in total power usage.
 

I work for a healthcare provider on AI and ML solutions to improve health care. We are a not-for-profit, so our only motivation is patient care while not losing money.

We have developed or used multiple GenAI tools, researched costs and quality, and run year-long trials. There are multiple areas where AI solutions are of comparable quality to human-based solutions, improve patient experience, reduce provider “overhead” time, and require little power.

Although these tools use general models that have been trained on general data, the data we actually feed them is public domain, so what the non-public-domain training data contributes is essentially grammar and general English comprehension. We do not use image or video generation.

You can argue that this is a niche, specialized market (but do look up the percentage of US GDP spent on health care first), but arguments that it is lower quality than humans, reduces quality of life for anyone, or costs significant energy are just plain wrong.

I honestly do not have the time to address all of these specific points (honestly, it’s just a lot!), but if there is ONE specific point you’d really like more info on, like energy calculations, quality/hallucinations, improving patient experience, or improving provider experience — I’ll give info on that. Most likely I will use an example that I am pretty sure most of you will find your doctor using within a year or so.

I will not address the established fact that GenAI tools can be used to reproduce text and images that I personally would consider unethically close to owned content. Those use cases are not ones we use — essentially we don’t ever look for true “creative” content: all our use cases transform content we own into new forms using LLMs rather than having LLMs create content.

As I have argued before, I believe the non-creative “content transformation” uses of LLMs are significant, will sustain the industry (possibly at lower levels than now), and will be beneficial overall.
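To make “content transformation” concrete, it looks something like the sketch below: an LLM rewrites text we already own into another form and is told not to add anything. This is illustrative only; it assumes the OpenAI Python SDK, and the model name, prompt, and note are placeholders, not our actual pipeline.

```python
# Illustrative sketch of a "content transformation" use: rewrite text we already
# own into another form, without asking the model to invent new content.
# Assumes the OpenAI Python SDK; model, prompt, and note are placeholders.
from openai import OpenAI

client = OpenAI()

def to_plain_language(clinical_note: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder
        messages=[
            {
                "role": "system",
                "content": (
                    "Rewrite the following clinical note in plain language for the patient. "
                    "Do not add any information that is not in the note."
                ),
            },
            {"role": "user", "content": clinical_note},
        ],
    )
    return response.choices[0].message.content

note = "Pt presents w/ HTN. A1c 7.9. Continue metformin 500 mg BID, recheck labs in 3 mo."
print(to_plain_language(note))
```

The input is ours, the output is a restatement of it, and the prompt explicitly forbids new information — that is the whole category of use I am talking about.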

Writing stories, evaluating complex concepts, creating art — yeah, that’s mostly trash, and it’s hard to see how the current tech can do much better.
 

Domain specific cases where it's genuinely handling non-creative work, surfacing stuff that might otherwise have been missed, or intelligently automating workflows are all great uses of "AI" (generic).

Where I think a lot of us have a huge issue is with it replacing human creativity and thought. The running joke is that we were told AI would some day take care of the laundry so we'd have more time for creative output, and instead it seems to be taking care of the creativity so we have more time for laundry (manual labor).
 
