AI in Gaming (a Poll)

In your opinion, what are some acceptable uses of AI in the gaming industry?

  • AI-generated images (book art, marketing, video game textures, board games, etc.) are acceptable.

    Votes: 24 (33.8%)
  • AI-generated 3D models (for video games and VTTs) are acceptable.

    Votes: 22 (31.0%)
  • AI-generated writing (books, ad copy, descriptions, etc.) is acceptable.

    Votes: 14 (19.7%)
  • Adaptive dialogue (for NPCs in video games and VTTs) is acceptable.

    Votes: 31 (43.7%)
  • Adaptive damage/difficulty (the game adjusts difficulty to your level, for example) is acceptable.

    Votes: 35 (49.3%)
  • Adaptive behaviors (NPCs, enemies, etc. react and change their tactics) are acceptable.

    Votes: 45 (63.4%)
  • Procedurally-generated maps (dungeon generators, rogue-like game levels) are acceptable.

    Votes: 51 (71.8%)
  • Procedurally-generated challenges (traps, monsters, whole encounters) are acceptable.

    Votes: 43 (60.6%)
  • Procedurally-generated rewards (item drops, random treasures) are acceptable.

    Votes: 43 (60.6%)
  • Other acceptable use(s), see below.

    Votes: 8 (11.3%)
  • There are no acceptable uses of AI.

    Votes: 16 (22.5%)
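To make the "procedurally-generated maps" option concrete, here is a minimal sketch of one classic dungeon-generation technique, a drunkard's-walk carver. This is an illustrative example only, not taken from any product discussed in the thread; the function name and parameters are my own.

```python
import random

def generate_dungeon(width=20, height=10, floor_target=60, seed=None):
    """Carve a dungeon with a drunkard's walk: start in the center and
    wander randomly, turning wall tiles ('#') into floor tiles ('.')
    until enough floor exists. The outer border is never carved."""
    rng = random.Random(seed)
    grid = [["#"] * width for _ in range(height)]
    x, y = width // 2, height // 2
    carved = 0
    while carved < floor_target:
        if grid[y][x] == "#":
            grid[y][x] = "."
            carved += 1
        # Step one tile in a random cardinal direction, clamped to the interior.
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x = min(max(x + dx, 1), width - 2)
        y = min(max(y + dy, 1), height - 2)
    return ["".join(row) for row in grid]

if __name__ == "__main__":
    for row in generate_dungeon(seed=42):
        print(row)
```

Because every carved tile is reached by walking from the last one, the result is always a single connected cave, which is one reason this simple approach shows up so often in rogue-likes.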

CleverNickName

Limit Break Dancing
I think it's relevant, but sure.

Technology disrupts labor, even artistic labor, and always has. It's a thing that is true.

We absolutely need guard rails for generative AI, for a lot of reasons: social, political, economic and yes artistic. But that doesn't mean it isn't a) inevitable and b) in the end a tool that produces a net good.
~sigh I'm trying really hard to stay on topic here, but you struck a chord.

My area of expertise is water engineering, and I can state for a fact that not every new technology "produces a net good." We have invented things, presumably for the betterment of humanity, that we truly wish we could now un-invent. (In my specific industry, DDT and PFAS come to mind.) It's just something to keep in mind when you think about the inevitability of new technology.

Anyway. Sorry I got off topic.
 


Reynard

Legend
~sigh I'm trying really hard to stay on topic here, but you struck a chord.

My area of expertise is water engineering, and I can state for a fact that not every new technology "produces a net good." We have invented things, presumably for the betterment of humanity, that we truly wish we could now un-invent. (In my specific industry, DDT and PFAS come to mind.) It's just something to keep in mind when you think about the inevitability of new technology.

Anyway. Sorry I got off topic.
I wasn't speaking generally that all new tech is a net good. I was specifically saying that I believe the current crop of AI (which goes beyond generative AI) will be a net good once we figure out how to wrangle them.

As a civil engineer myself, though, I feel you.
 

Celebrim

Legend
The IBM Watson Explorer AI replaced 34 workers at the Fukoku Mutual Life Insurance company back in 2017. The AI replaced employees whose job it was to calculate payouts to policyholders, a workload of about 132,000 calculations that year alone. These weren't low-value employees doing unnecessary work. Now, I'm not going to say this means it's curtains for humans; machines are supposed to make us more efficient, after all. But AI is going to transform a lot of job sectors, and while it may be a net positive, there will be short-term costs.

I'm a software developer and yeah, we think about this sort of stuff all the time. My job is to take someone else's work, automate it, and ultimately reduce company costs by reducing labor costs. That means if you hire me, you're planning on eliminating positions whose combined cost exceeds my salary.

I have designed a system a lot like the one you are describing. The company was trying to do it with emails and Excel spreadsheets, and it was making six-figure errors annually in the amounts it paid out: some people weren't getting what they were owed, while others were being overpaid. "Low value" is very different from "low salary"; you can have a very high salary and be a very low-value employee. It turns out that certain kinds of repetitive mathematical calculations that depend on large numbers of factors are very likely to produce large amounts of human error and unnecessary waste, to say nothing of unnecessary costs and poor customer service.

I don't feel bad when I fix problems like that. I do, however, worry about what is going to happen when we can do most jobs better with a machine than with a person. I worried a lot more about the $10-an-hour call center workers' jobs when I was automating them away than I did about the guys with master's degrees in business administration and finance. Industrialization, after the initial economic disruption, was ultimately of great economic benefit to the lower class. But the death of industrialization in developed nations has left no new economic prospects in its place, and I'm not sure this new round of automation is going to leave obvious answers for what a guy with a 95 IQ is going to do either. I will say this, though: from a purely computational perspective, a robot maid is a heck of a lot smarter than a robot artist.
 

Blue

Ravenous Bugblatter Beast of Traal
I'm 100% fine with any and all AI tools. As long as the models are ethically sourced.

So if you have a retro alien generator trained on images from before 1923 that are all in the public domain, go for it. If you've hired writers to create your setting bible and then used that to train a chat AI, great!

Painters needed to understand their paints, transparencies, brush techniques, mixing, blending, perspective, anatomy, foreshortening, and a host of other skills; then along came the camera, where all you needed to do was be in the right place and light and compose the shot (or, today, write a prompt). There is room for AI tools and artists. Just like when cameras arrived, it's not going to be the same as it was before, but it's not an extinction event.
 

Reynard

Legend
I'm 100% fine with any and all AI tools. As long as the models are ethically sourced.

So if you have a retro alien generator trained on images from before 1923 that are all in the public domain, go for it. If you've hired writers to create your setting bible and then used that to train a chat AI, great!

Painters needed to understand their paints, transparencies, brush techniques, mixing, blending, perspective, anatomy, foreshortening, and a host of other skills; then along came the camera, where all you needed to do was be in the right place and light and compose the shot (or, today, write a prompt). There is room for AI tools and artists. Just like when cameras arrived, it's not going to be the same as it was before, but it's not an extinction event.
I think one of the fuzzy issues is going to be "retroactively ethical."

By that I mean things like (just as an example; I don't know what they are actually doing) DeviantArt. If that site makes a deal with Midjourney and then changes its terms of service so that, to continue using the site, you must grant DA permission to let MJ use your art for training, is that "ethical sourcing"?

Remember: I am on the side of this stuff ultimately being a net good, so I am not trying to throw wrenches into the works. Even so, it is something to think about.
 

Blue

Ravenous Bugblatter Beast of Traal
I think one of the fuzzy issues is going to be "retroactively ethical."

By that I mean things like (just as an example; I don't know what they are actually doing) DeviantArt. If that site makes a deal with Midjourney and then changes its terms of service so that, to continue using the site, you must grant DA permission to let MJ use your art for training, is that "ethical sourcing"?

Remember: I am on the side of this stuff ultimately being a net good, so I am not trying to throw wrenches into the works. Even so, it is something to think about.
I don't think that's even a theoretical issue; for chat-based AI, I believe Reddit and some others have already done so. Reddit's big scandal last year, with the pricing changes that made a lot of subreddits go dark, would have hit a lot of things, from accessibility apps to the big three (four?) third-party browsing apps. And a big part of it was that Reddit wanted to monetize all the scraping that OpenAI and the others were doing of what people wrote. Allowing use for AI has been in the T&C for a while, IIRC.

Adobe has had issues where artists have found AI-generated stock that Adobe is selling (at up to $80 each), where the artists never opted in.

There's a lot to ethical sourcing, and it'll be a big deal for years to come. The EU, I believe, is working on laws, and there's talk about applying existing copyright laws in the US, but it needs to be nailed down.

And if an artist doesn't want their work submitted to AI because of DeviantArt changing its T&C, perhaps that will spawn non-AI portfolio sites as a backlash.
 

Reynard

Legend
Is what you write on a message board yours? Or does the ToS hand over those rights to the operator of the board? This is important in discussing training AI because, frankly, there is no better way to train AI on how people communicate than with actual textual examples of that.

But do I own my posts on ENWORLD? If not, what happens if I decide to sell a product based on my post(s)?
 

Celebrim

Legend
But do I own my posts on ENWORLD? If not, what happens if I decide to sell a product based on my post(s)?

No, you lose rights to your work as soon as you post it on EnWorld. I don't know what the EnWorld terms and conditions state, but I have been on forums whose terms explicitly said that whatever you posted became the intellectual property of the site owner. In those cases, if you published based on your posts at those forums, the site owners could sue you. You'll also find that having posted at EnWorld tends to breach the right-of-first-publication terms publishers demand, meaning they'll refuse to publish a work that is heavily based on posts you've made here. I don't think that's true of EnWorld, or at least it wasn't, because they once asked me for permission to publish a post I'd made here; but to be honest, I haven't read the terms and conditions here in a long time.

That leads into something I've wondered about for a long time: I think the reason the overall quality of posts on EnWorld has gone down over the last 20 years is that the vast majority of posters are now holding back their good stuff, because self-publishing has become such a big thing in the tabletop RPG community. We're no longer the "potlatch" community we used to be, because it's too easy to commercialize your own content. I know that in general I now only post things I couldn't claim copyright over in the first place, because they are derivative of other IP. And I've noticed that not only are fewer people posting content generally, there aren't even people asking for content assistance anymore.
 

Celebrim

Legend
There's a lot to ethical sourcing, and it'll be a big deal for years to come. The EU, I believe, is working on laws, and there's talk about applying existing copyright laws in the US, but it needs to be nailed down.

I think existing laws are all on the side of the AI scrapers, who under current law will be able to claim fair use (in the USA, at least). But I do expect that in the near future a lot of thought will belatedly be put into the subject, and we'll see some new legal protections put in place. The little guys complaining now won't be the driving force, and I wouldn't be surprised if the laws aren't really written to protect small content providers, but I do expect something to happen.

But really, that's just the tip of the iceberg. We're about to find we need a whole raft of new legal ideas.

Oddly, codifying all this into law is likely to accelerate the AI revolution. One of the things that has been holding us back since the 1990s is ambiguity about legal liability. If Congress stops being useless for a change and actually writes the laws that are needed, it will greatly ease market and investor worries about creating AI.

Mind you, I don't expect the first round of laws to be all that good. Based on the public reaction to AI, I expect people are scared, easily manipulated, and don't have a clue what is in their own interests when it comes to AI. The heavy hitters are going to want the law written in such a way that it protects their interests and profits. They couldn't care less about fundamental issues like whether an intelligent agent is loyal to the purchaser or to the seller.
 

aramis erak

Legend
I don't feel bad when I fix problems like that. I do, however, worry about what is going to happen when we can do most jobs better with a machine than with a person.
The thing is, most manufacturing jobs can already be done more accurately and faster by automated processes... but as yet, not at the same cost as a Chinese or Indian national in a sweatshop, even after accounting for defect rates.

When the machines' costs come down, even the wage slaves of the most populous nations will be looking for new work... but little will be left, save feeding the materials into the machine. And even that can wind up automated.
 
