• This is r/thathappened material. It's just 4chan greentext.

    I'm no fan of "AI" but this is just written to indulge whatever we already wanted to think was happening in boardrooms. It's cathartic to imagine I guess.

    Edit: I guess the source was more clearly satire. It definitely works for that.

    I just think it's a fun short fictional story told in an easily digestible format, and yep, just like the 2% of 4chan greentext that gets reposted.

    I thought it was pretty well done, so I posted it here. :) If not everyone agrees, I'll happily take the downdoots along with the (hopefully nonzero) updoots...

    It's "install Adobe reader" repurposed.

    Probably AI slop itself. Useless either way, tbh.

    We don't take kindly to fiction around here. Everything must be Real Life, because we prefer laughing AT people instead of WITH them.

    It's a poem.

    They even sign it at the end.

    The details are made up, but the narrative is true.

    The facts may be a lie, but the story isn't

    • Stephen King

    The story you are about to see is a fib, but it's short. The names are made up but the problems are real.

    • Mathnet

    Agreed. For a "positive" comparison point, my employer is rolling out a new computer system to obsolete like a dozen decrepit ones that don't talk to each other. It's a massive chore, but an ultimately important one for the business. 

    In the corporate training slides, there's one reference to how the new system will be "better for AI." Horseshit. The new system has no AI integration whatsoever. However, I'm sure that slide sold someone higher up on spending the necessary money to make the upgrade, because we needed to sell ourselves to investors as modernizing with AI.

    At work we use digital scales to measure specimens. Each is connected to a simple laptop that transfers the data to a database. These laptops do nothing but that simple weighing. Still, like every computer in the factory, they were upgraded to W11 and got all the AI bullshit that comes with it. It is never used, but IT or anyone else could state "we adopted ten more machines for AI" and it would technically be true.

    Genetically, paedophiles have more genes in common with crabs than they do with you and me. Now that is scientific fact. There's no real evidence for it, but it is scientific fact.

    You're talking nonce-sense

    Really? Because if that’s the case, the problem is not that AI doesn’t work, it’s that it’s being actively sabotaged. Piss away a bunch of money and do nothing substantial, then blame the AI, which you never even attempted to use.

    Actually, you’re right. The people who bitch about it the most are people who haven’t tried it, don’t understand how to make it benefit them, and don’t REALLY think about the subject longer than it takes to jump on the bandwagon. 

    I use AI frequently. My hairdresser friend gave me shit about the data centers and their power/water consumption... someone who washes chemicals down the drain while running a hairdryer ALL DAY thinks that I’m the problem.

    The only part of that entire post that implied any criticism of AI was the bit about it taking 45 seconds to summarise an email you could read in 30 seconds, and it’s clearly an exaggeration.

    If you like AI so much, maybe you should get AI to summarise Reddit posts for you and maybe you will do a better job of understanding them.

    The scale of the water usage is not the same, not to mention you are participating in wasting that water as well, not just your hairdresser.

    You’re right, she uses WAY more.

    And yeah, I’m participating in water usage. But being scolded by someone who doesn’t even recognize that their usage dwarfs mine is stupid.

    I used to have a couple small businesses that AI would have been useful for. Now I'm in the trades and it's ruining my life.

    It's letting a bunch of people (managers, engineers) who don't know what they're talking about act like they do. And they love saying it's gonna take my job but...I really don't see it happening. I see it taking their jobs, if anything.

    I've been a machinist for a while now, and every new technology is supposed to take my job. It never does though.

    I also think chatbot AI is just a stupid tool for us to have. It does things people could do, and uses resources to do it. We need to spend resources on people anyway; we don't need to spend them on frivolous things like chatbots. It's good for business and bad for the world.

    You're a machinist, and you feel safe right now? Wow dude. If you are manually machining, it's a wonder you still have a job. If you are running a CNC or programming G-code or something, you won't for much longer.

    I've been a tool and die maker, heavy machinist, oilfield machinist, production, prototyping, etc. I've also been the guy who repairs machines when they break down, and I've been in charge of production on a shop floor.

    I'm the guy, when it comes to machining. I'm Him.

    And i don't see AI taking my job. I'm not even a little worried about it. There's a lot more to it than people think.

    Yeah, anyone who talks like that is definitively NOT “him”

    I've been a machinist for a while now, and every new technology is supposed to take my job. It never does though.

    Most low skill machining has already been automated. But the next step of automation is incredibly difficult because things violently deconstruct themselves seemingly at random while machining.

    I felt that one deep in my corporate addled bones, brother.

    This is how they function. This is the level of stupid that goes down in there, when nobody knows what something does, and nobody dares to look stupid by asking.

    I do believe it. I'm experiencing it myself, if a bit downstream.

    Where I work, people are presenting their own work as entirely AI-generated just to feed success stories to upper management. If they say a project was done with minimal AI assistance, they'll get drilled on why it wasn't done 10x faster. If they say it was entirely AI, management takes turns celebrating themselves.

    They have zero understanding of the size of any task, so they just assume anything coded by humans could've been done by AI in 1/100th the time.

    I’m glad I’m an accountant working in cross-border tax and compliance. It’s so easy to explain that AI is far too risky for the type of work that I do, and that its only real value to me is as a search engine (going straight to the references) if Google fails.

    I'm an exec in a tech Mag7 and this is flawless. I don't think I could have written it more realistically.

    Editing this from Mag10 to Mag7 was a fantastic touch

    Mag10 is the index

    Eh, it may be made up, but it's satire that isn't too far off from the truth.

    AI is useless as fuck for anything but messing around.

    Sure, but I'm not confident people are getting that this is satire

    There is no satire so obvious that dumbasses won't mistake it for reality

    Yeah fair no doubt. No problem in pointing out that it is fake/exaggerated.

    AI does suck tho.

    Yeah, the comment was so good I saved it when I saw it originally, but the title of this thread is awful. It's not an explanation. It's just... good copypasta material.

    It sounds like the Google Ultron guy got a few promotions.

    Hah, thank you for reminding me about the legend

    It doesn't even really make sense. How on earth is paying Microsoft for useless copilot licenses supposed to benefit your own career advancement at some third party company? They're getting the rough outlines of the bullshit right, but this would make way more sense if they were pitching themselves as a Microsoft salesperson or something.

    Because you think people in large corporates have properly measured KPIs - they don’t. Ever watch Office Space?

    You also think that large corporates are incentivised to be efficient and productive - they aren’t.

    Plenty of books about this, from technofeudalism to bullshit jobs.

    Obviously a lot of stupid bullshit happens in big corporations, but the details here just aren't how this shit actually works.

    Some mid-level executive doesn't just freelance an initiative like this into being, then go on a spending spree justified by inventing completely made-up figures that don't have anything to do with established ways of measuring these things, and in the process rocket up the hierarchy to SVP "by Q3". I mean... it's not like "productivity" is some kind of arcane KPI that you can just pretend you're 10Xing and expect skeptical higher-ups to just go along with because they're so mystified by the concept.

    It's not like they're fishing for ways to promote random subordinates, either. That's just not how the incentive structures actually function within hierarchies like this. Meanwhile, you're telling me literally only 1% of employees at a business that apparently does some kind of software development ever even tried it once? That sounds completely made up, and frankly, there actually are quite a few use cases for such people. This shit is a far cry from what's been promised, but every developer I know has found these tools to be genuinely useful for certain kinds of tasks. Also, since when does HR give a shit about what metrics are being used in a completely unrelated branch of the business? That ain't what they're there for.

    Again, the rough outlines of this line up with something kind of resembling plausibility, but it falls apart in the details.

    What really happens is that someone towards the top of the ladder tells someone below them, "figure out how we can use AI" and then that person goes and does whatever bullshit they need to do to implement it and then seem busy. The story would be much more convincing coming from someone who was told by a brain dead executive that they needed to be better about implementing AI and then started doing this kind of bullshit because they were made to find numbers showing it was working. This sort of chicanery works really well when you're trying to convince someone who actively wants to believe you, but not so much otherwise.

    What’s a KPI?

    You could google that…or I can tell you that it’s “key performance indicator” and it’s a bullshit term used in corporate environments where someone wants a number they can point to as a reason they should get promoted because number went up.

    Oops. I guess I did all the work

    KPIs are not, in and of themselves, bullshit. When they are used as intended they are meant to be ways to track what is and isn't effective when it comes to organising companies or tackling processes.

    The problem is that most organisations move KPIs from being a measure to being the thing people are assessed against so something having a negative impact on KPIs risks your position, job, promotion, or benefits so people immediately begin to KPI-hack.

    My organisation moved from a CEO who viewed negative turns in KPIs as a thing to investigate collaboratively to try and understand, to one who is way more interested in stat-hacking and it drives me crazy.

    Well, those people who react that way are handling KPIs in a healthier way - too many people forget the map is not the territory. Or to paraphrase Goodhart, when a measure becomes a target, it ceases to be a good measure

    aka "You get what you measure" after some time

    When it stops becoming data that you use to inform your decisions, and starts becoming a metric you're managed against, it's ceased to be useful.

    The only experience I can really speak to on this is Net Promoter Score. The entire hospitality industry (and many, many others) is in hock to Net Promoter Score. If you use it as data, it's great. If you start insisting that managers have to hit a certain percentage of 10s, it becomes all about gaming your reviews to hit that score. You don't think about how to convert your 7s and 8s into 10s; you just get as many 10s as you can.
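    (For anyone who hasn't had NPS inflicted on them: it's just the share of 9-10 scores minus the share of 0-6 scores, which is exactly why 7s and 8s count for nothing. A rough sketch with made-up survey numbers:)

    ```python
    # Rough sketch of how Net Promoter Score is usually computed
    # (the responses below are made up; 0-10 scale).
    responses = [10, 9, 8, 7, 10, 6, 3, 9, 10, 5]

    promoters  = sum(1 for r in responses if r >= 9)   # 9s and 10s
    detractors = sum(1 for r in responses if r <= 6)   # 0 through 6
    # 7s and 8s ("passives") count toward the total but toward neither bucket
    nps = 100 * (promoters - detractors) / len(responses)

    print(f"NPS: {nps:.0f}")  # -> NPS: 20
    ```

    Once that single number becomes the target, moving a 7 to an 8 does nothing for your score, so everyone just games for 10s.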

    A metric tracked is a metric targeted.

    Googling an abbreviation often gives several different results.

    Come on man, let the human ask a human. Your response is better than google's anyway.

    Ever watch office space?

    Citing fiction to back up your argument isn't a good look...

    Spaceships definitely make sound in space. Ever seen Star Wars?

    That’s why I put my proper references at the bottom, Chicago style

    Except you didn't?

    They asked AI to do it

    Because AI.

    It's like when companies just started putting "block chain" in their name and the stock would go up. Doesn't matter if it does anything useful as long as it makes the line go up, and right now even the mention of AI in a company makes the line go up. Doesn't matter why

    That's why this would make more sense from a sales perspective. Buying a shitload of copilot licenses isn't pushing up anybody's share value, and if this was all just a hustle to juice the share price, it wouldn't be being spearheaded by some kind of mid level exec lying to deceive higher ups. The way this works in the real world is that the people at the top push these sorts of initiatives and then basically ask the folks lower down the chain to justify their decisions.

    Exactly. It's "We need to use more 'AI slurp' to keep up with the McGuffin Corp! We can't fall behind!" --> VIP asks "How much 'AI slurp' is your department using?" --> Department heads go "We need to make #AI-stat go up in every area." b/c verbal 'tactics', axe-pressure, carrot-held-at-3MLY

    I mean, there may be some 'initiators' selling ideas to UMGMT and 'getting ahead of the monkey-copying curve'

    I could see it, honestly -- their career is advanced by convincing higher-ups that they're doing bold futuristic stuff, and spending money makes it seem real. As satire, I think it works.

    Where I work, people are actually misrepresenting their own work, saying it was done entirely by AI just to feed upper management narratives about AI success. If you say you didn't extensively use AI, they'll assume it could've been done 10x faster; if you say you did use AI, everyone will congratulate themselves on how smart they are for embracing the future.

    It's fucked.

    Genuinely, go read David Graeber's Bullshit Jobs. Companies are not rational actors; they consist of people in a hierarchical organization who have meat brains and desires other than making money for the company. The feeling of being able to take credit for something important is itself a driving factor, as are the desires for underlings, for status, and for forcing those beneath you to comply. This is then combined with the inherent limitations of institutions in understanding the world: bureaucracies like a company can only really understand reality by simplifying and abstracting it, and overly simplistic buzzwords and easily collected metrics give the institution the impression that it understands the world it is operating in.

    All (most) regular companies are nominally already profitable, or else they wouldn't exist. As an up-and-coming exec, the key is to "do a thing" to get promoted. Now, you don't want to rock the boat and hurt revenue or profit numbers by messing with existing operations. Solution? Spend company money on "new initiatives"!

    This is 100% how things in a corporate environment happen. The only thing c suite (CEO, CFO, CTO, etc) cares about is making money for investors. In a private company that's the board. In a public company that's shareholders. Nothing else that happens in the company matters other than the graph going up and to the right. Investments in AI make corporate boards and shareholders think the company is doing the "smart" thing, which keeps that line going up.

    The only thing middle managers care about is getting to the c suite, or not getting fired. This means - as others have said - it only really matters if you appear to do a good job, not if you actually make the company money or treat your employees well. This is exactly what would get someone like this promoted.

    Excuse me while I bail out with my $10M golden backpack

    Oh yeah, our poor company. So sad. Thanks Mr. GVT for the pity lobbyist dollars. We sure sold those 'optics' well.

    Pfft, it's the same shit with collecting all the data about people yet people analytics is still just about as confounding as possible. Good luck predicting employment behaviors.

    It's written like typical 4chan greentext, but in a different font

    The sad thing is that it's not that far from reality.

    A few weeks ago I saw a keynote speaker push for the use of AI in government. He repeatedly backed up his slides with ChatGPT spot illustrations. His big push was for 'non-coders' to get AI to automate things for them. He really didn't like it when I asked whether the speed and convenience were making people lower their standards, or whether there were risks in having people push things out in areas where they weren't trained and couldn't spot basic fundamental errors. I based this on having formerly been an illustrator: I'd never had a client who would overlook basic composition or spelling errors, let alone people with three legs, yet this guy was happy to let all that and more go out to 1000+ people while undermining his own points.

    Of course, totally with you there. I have to sit through some incredible meetings where people just make up shit to feed success stories to management. I just don't want us to spread satire as if it's reality when the reality is ridiculous enough on its own.

    "Can you tell me a little about such an 'AI' (as you use it here) works?" Follow up "What would be strengths and disadvantages?"

    You want to know why not to use basic prompts to create spot illustrations? Or what skills people would need to make the resulting images usable? Or why it's a terrible idea to get people who don't have the relevant skills to use such services, put the resulting output in front of an audience, and call yourself professional?

    Well, for a start, aside from the ethical issues of where half the AI services scraped the data for their models from, let's address the fact that while the various platforms can predict language, they can't actually understand what you are typing.

    People tend to instinctively understand how many limbs or heads things should have (and thus wouldn't fuck that up even at the thumbnail stage), whereas most commercial services don't. This creates an issue where feedback like 'in the background you have two people in the second row with conjoined heads, please separate them' gets you fuck all from your Groks, your Copilots, and your Stable Diffusions. Realistically, if you were hellbent on using AI, you'd still need a trained individual to use Photoshop's infill tools to fix things.

    So besides the ethical considerations, the technical considerations, the fact that you can't copyright AI-produced works (meaning that anybody else could freely use and monetise your work), and the fact that you still need trained individuals to vet and fix the output, I'm not seeing a ton of upsides.

    How about you write me an essay on why putting easily spotted and fixable mistakes up on screen in front of several thousand people undercuts the argument that AI-generated data doesn't need professionals to check it before use?

    Yup.

    That’s not what’s going on in reality. The part I worry about is all the people who are gonna wake up blindsided one day soon: “Oh shit, it really did fundamentally change some important aspects of work.”

    I don't think it reads as someone attempting to convey a real thing that really happened. It's overblown, but it does convey the pitfalls of the house of cards that is corporate adoption of AI for ill-defined purposes.

    It reads like the unemployed's idea of a job

  • Guys, the poster is quoting someone on Twitter. That's what the "- @gothburz" at the end means.

    But I'm sharing the reddit post because.... well.... that's where I found it and... really... fuck X, man. I ain't linking to that dumpsterfire. :D

    Hopefully kosher with the rules of the sub.

    You're not the one I'm talking about. The comments aren't getting that it's poetry.

    Media literacy needs to be taught in schools

    You’re a good one rabbi

    They quoted the person on Twitter and it's fine to put it here; it's giving the redditor more credit in the title that's the issue for me. Instead of saying "explains", they could have gone with "quotes" or "relays".

    Agreed. However, I also accidentally misspelled their name and I can't go back and edit the title so I gave them slightly less credit than planned. ;)

    I ain't linking to that dumpsterfire.

    Xcancel is your friend.

    Just put "cancel" after the x in any Twitter URL and it goes to a Nitter instance. No log in required, no tracking, no Nazi dumpsterfire.

    Duly noted and saved for later!

  • Their comment reads like a satirical LinkedIn post

    It would be satire if it weren't rather true. Every investor wants to see your AI-first strategy, even if it makes no sense for your product. It's buzzword compliance to keep investors happy, just like when half the corporate world was rolling out a blockchain that made no fucking sense.

    Isn't that the point of satire? That it's exaggerating things that are actually happening?

    Also, did anyone actually "roll out a blockchain", or did they just say that was happening somehow?

    I wish it were an exaggeration.

    Oh, plenty of companies did. Largely as a "far shittier and slower database you can't delete from", that they then promptly abandoned when they realised that they could just use a real database.

    AWS and IBM released managed blockchains, I'm sure GCP and Azure did too, it was wild.

    Dang. This future is frustrating.

    That's the issue with public markets. 10% of the money in AI is in trying to make products and services people might actually want to use. 90% is in hyping up investors to believe it's going to make a trillion dollars. Doesn't matter how, you just have to make them think it will.

    Our company has like 35 copilot licenses for the entire site of over a thousand people, and they monitor it and if you don't use it, you lose it and it gets transferred to the next person in line. They all get used, a lot.

    It is, because that's where they stole it from

    Copying a post from another platform while giving the source is not stealing.

  • We had a mandatory AI training we had to take about the AI tools our company has invested in. Here's the gist:

    • Our company has specific pre-approved AI tools you're allowed to use

    • Please use them, they're great

    • Don't use any AI tools other than those specific ones

    • Manually verify any answer that AI gives you cause AI can lie

    • Manually verify any numbers that AI gives you cause AI can lie

    • Don't enter anything proprietary into AI

    • Verify any information provided by AI in the form of a customer deliverable before showing it to a customer

    • Seriously, please use the AI tools, they're great

  • Is this post itself not written by AI? Or is that the joke and I'm missing the point.

  • This guy sounds like he made it up with ai 😂🤡

  • Kinda wish I had a job at a place like this. Seems like it would be easy to skate by w minimal effort when nobody cares or knows anything.

    At my job all the devs are using Copilot and are expected to master it and stay on top of it. And they're getting a pretty good result for the most part.

    We crank out software at a rapid pace that generates billions of dollars of revenue and supports trillions of dollars of financial transactions and people definitely know what's going on in minute detail, what our competitors are doing, and what our customers are asking for. As well as how our systems are performing. We need 99.999 etc uptime, multiple redundancy in all systems, and unit tests and exhaustive QA at every step of the way.

    Maybe there's a bunch of Dunder Mifflins out there but I've never worked at one.

    Seems like it would be easy to skate by w minimal effort when nobody cares or knows anything

    I've worked in places where skating by with minimal effort was the norm, and anyone who tried to 'do more' was indirectly punished for it.

    It was miserable and life draining for me. My shifts dragged on and on, the clock slowed down, and I felt guilty for being so unproductive on top of it. Some people seem to do okay in such environments, but not everyone.

    I'd much rather be working at a place where actual problems were being solved in at least a somewhat efficient manner. Provided they treat their employees decently.

    There's a lot of companies that operate like this. But it's not fun working in them.

    Unless you share in the delusion, you have to be a very manipulative person to constantly validate their hallucinations.

  • It might be satire, but honestly, from what I keep noticing, this sort of approach to introducing AI is not even that uncommon (minus the obviously comic inserts). It shows how fundamentally most people misunderstand the use of AI in corporate processes: how most people, despite many using various AI chats daily, still don't grasp how to use AI as a tool rather than as a chat buddy or someone (intentional personification) who does your work for you. It's not about any sort of "prompt engineering", just about understanding the basic concept of what LLMs are good at and what they are bad at.

  • Wake up babe. New copypasta just dropped.

  • reposted by ai, upvoted by ai

  • The post is satirical but I think people are kinda missing that this sort of "we make metrics to fulfill our metrics because when our metrics go up the boss is happy" dynamic is actually a lot of how big organizations work.

    I've worked for large non-profits, I have family who've worked in government and large companies, and people close to me have put in time in corporate America - there are an astonishing number of jobs that exist purely to generate emails.

    David Graeber would call these bullshit jobs but the principle is the same - a job that exists largely because it helps someone else look good at their job.

    AI is a huge opportunity to make people working these kinds of jobs be able to look super productive while actually not doing much of anything and shifting blame for the inevitable shitstorm onto someone else.

  • I mean if your job is fake yea

  • This is a satirical post about corporate life, generated largely by an LLM, posted on Twitter, copied to reddit, and posted in this sub... why?

  • [deleted]

    This was written by AI.

    Based on what?

    Overly terse sentences, overuse of quotations, and terms highly enriched in LLM output, like "digital transformation".

    It tests as LLM-generated with high confidence.