Dilbert_X
The X stands for
+1,815|6362|eXtreme to the maX

RTHKI wrote:

My right car speaker sounds like the music Uzique listens to
You mean this?

Get that shit fixed.
Fuck Israel
uziq
Member
+497|3708
that person has oscar nominations, i think she's doing alright.
RTHKI
mmmf mmmf mmmf
+1,741|6993|Cinncinatti
Well no one is perfect

I like it
https://i.imgur.com/tMvdWFG.png
KEN-JENNINGS
I am all that is MOD!
+2,979|6888|949

Three 6 Mafia has an Academy Award
uziq
Member
+497|3708

KEN-JENNINGS wrote:

Three 6 Mafia has an Academy Award
i'm not suggesting it's an arbiter of quality. i'm not exactly in the habit of listening to film soundtracks like a cretinous simpleton, myself. rather, i'm pointing to the fact that she's acclaimed in a way that dilbert would recognise. lady gaga has an academy award.

i don't expect dilbert to lend any sort of time or patience to 'experimental' music, but hey sometimes this stuff actually wins mainstream awards too.

she is ridiculously talented. she has done some of the most memorable and distinctive film soundtrack work in recent years, and also plays in small noisy punky bands in london and produces underground break-out stars like tirzah. pretty hard to hate on someone with that range and singularity, tbh, if you're at all appreciative of creativity.
SuperJail Warden
Gone Forever
+642|3975
December 23rd 8:30 AM in Manhattan. I called the lady at the post office window a rude bitch. I am the first world problem today.
https://i.imgur.com/xsoGn9X.jpg
Dilbert_X
The X stands for
+1,815|6362|eXtreme to the maX
I dismantled the ceiling fan and organised the nuts and bolts in bowls.

The Cat scooped them all out and pushed them under the cupboard.
Probably lucky she didn't eat any.
Fuck Israel
unnamednewbie13
Moderator
+2,053|7028|PNW

Evolution journal editors resign en masse
https://arstechnica.com/science/2024/12 … -and-more/

This bit:
Worst practices

In-house production has been reduced or outsourced, and in 2023 Elsevier began using AI during production without informing the board, resulting in many style and formatting errors, as well as reversing versions of papers that had already been accepted and formatted by the editors. “This was highly embarrassing for the journal and resolution took six months and was achieved only through the persistent efforts of the editors," the editors wrote. "AI processing continues to be used and regularly reformats submitted manuscripts to change meaning and formatting and require extensive author and editor oversight during proof stage.”
This hits home even from my own limited meddling with generative AI as a toy. If I were making something serious, it would be easier to just write or illustrate it myself than to burn the midnight oil making little micro-adjustments to a prompt.

I don't think I'm a neo-Luddite.

I'd like to know if stuff I'm looking at has AI involved. I'd rather not get three paragraphs into something masquerading as a human-written piece and catch some obvious GPT-ism they didn't clip out. Waste of my time.

At some point in my job, I'm pretty sure I'll end up submitting some report that gets AI-mangled beyond reason by a third party somewhere down the line. And if my name's still on it, my PC might get defenestrated.

Imagine this as a topic on fresh-faced BF2S back when people were joking about inept bots in their favorite video games. Now the bots are writing bad articles.

Last edited by unnamednewbie13 (2025-01-04 09:19:52)

uziq
Member
+497|3708
elsevier are just about the biggest villains in any sector of publishing -- the fruits of robert maxwell's labour, in addition to ghislaine and the rest of the brood. academics have been at war with elsevier et al as a class for decades over unethical and shady practices: they are one of the main reasons for the entire drive for open access in academic research. it wouldn't surprise me if that entire resigning editorial board for a top journal were already doing all that work voluntarily, for free, in the first place, even without all the new AI controversies and issues.

they were in hot water over AI a long time before this, as they entered into agreements/partnerships with various new parties in the AI world, allowing them to train on elsevier's vast archive of content/data to build their models ... all without informing or gaining the consent of the hundreds of thousands of researchers. it's a mirror image of the copyright court cases which are still ongoing/rumbling through the system now from celebrity authors and their publishers. many people do not take kindly to having their life's work fed into a vast recombinatorial machine in order to produce semi-meaningless landfill slop.

pretty sure production for elsevier journals was outsourced years ago, anyway. if you're publishing with one of the corporate giants, your choice is between typographical errors introduced by overworked indians in chennai and having your work marred by cutting-edge AI bollocks. the giant academic publishing conglomerates have a class of fattened geese called 'project managers' which far outnumbers any workers with actual editorial/production nous, and they love to introduce this sort of modish/faddish shite into already distended and labyrinthine production processes involving stakeholders on 3 different continents, in order to continue justifying their existence as a middle tier of the organisation.

Last edited by uziq (2025-01-04 10:14:48)

unnamednewbie13
Moderator
+2,053|7028|PNW

i cannot imagine expecting coherent results from feeding even a simple list of technical bullet points into the kind of software that cannot count the number of Rs in the word 'strawberry' (and can even contradict itself with different wrong answers) until you give it very specific instructions and targeted slaps on the wrist.
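(for contrast, the letter-counting task those bots trip over is a one-line deterministic computation. a throwaway sketch, purely for illustration:)

#include <algorithm>
#include <iostream>
#include <string>

int main() {
    const std::string word = "strawberry";
    // counting characters is exact and repeatable -- no prompting, no slaps on the wrist
    std::cout << std::count(word.begin(), word.end(), 'r') << '\n';  // prints 3
}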

that management types trust that stuff would be more flooring to me if i hadn't had conversations with otherwise reasonably coherent individuals who will tell me with big eyes and a straight face what they learned from grok, like they'd had a deep conversation with the hal 9000.

it's only going to get worse. macbeth makes tests with chatgpt.
SuperJail Warden
Gone Forever
+642|3975
Not just tests. Entire lessons. Good lessons. It saves me a bunch of time. I can produce reading assignments for students lightning fast. Higher quality and far cheaper than Study.com lesson plans.

Behold.
https://i.imgur.com/r5EOGoT.jpeg
The reading assignment on Napoleon was generated across 10 slides. I included underlined text and keywords in order to support the special education students. We read it together in class. They answer two questions at the end as an exit ticket.

I created a five-PowerPoint series for my district. I was paid $50 for it. It went into the district folder and kids are using it now whenever teachers need to fill a week on the Napoleonic Wars. Full 5-day special-education-compliant reading lesson ready to go.

AI Won’t Replace Humans — But Humans With AI Will Replace Humans Without AI
I could talk more about how powerful these tools are in the right context.

And before Uzique yells at me, I understand your job is to collab with a human on making something more appealing to human readers. My job is to collab with children/parents. AI will make you much more productive. Give it more of a chance.
https://i.imgur.com/xsoGn9X.jpg
unnamednewbie13
Moderator
+2,053|7028|PNW

chatgpt wrote:

“As his eyes scanned the horizon, a profound softness swept over his gaze, the weight of history pressing gently against his soul. Before him lay the smoke-streaked remnants of battle, but beyond that, the sprawling expanse of a new dawn—an era forged in the crucible of revolution. His thoughts, though momentarily lost in the chaos of the moment, turned toward the enduring legacy of this monumental struggle. The French Revolution, born in the fires of 1794, would irrevocably shape the course of civilization, leaving its indelible mark across every future generation.”
lightning-fast, quality stuff. basic prompt, admittedly, but chatbots do have a sort of recognisability, or a certain stench if you will.

e:
his legacy, marked by both military genius and ambition,
that's so gpt. after a while of this junk, some of it starts sticking out like a sore thumb, the way even a 150px thumbnail of ai-generated content will. if more teachers do this we'll be training a generation of people to talk like robots, lol. it'll be a choice between something like the chicago manual and sticking to the uncanny valley of deeply ingrained openai conditioning to prefer vague, off-key, sweeping generalisations and grandiose statements in their everyday communication.

Last edited by unnamednewbie13 (2025-01-04 22:23:53)

Dilbert_X
The X stands for
+1,815|6362|eXtreme to the maX
My robot stepper/driver combination doesn't seem to want to do 1/8 micro-stepping any more.
1/4 seems to work; it seems to be some current-control thing.
OK, so it's a cheapo robot with single-card on-board stepper drivers.
I should just buy a whole new robot/driver system with discrete stepper drivers, or really a proper servo setup.
Maybe I'll save that for the other robot I'm building.
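For reference, on the common hobby drivers (A4988/DRV8825 and the like) 1/8 mode is just a pair of mode pins, and flaky behaviour at the finer microstep settings usually gets chased back to the current-limit (Vref) adjustment, which fits the "current control thing" guess. A rough Arduino-style sketch, assuming a hypothetical A4988 with its MS pins broken out - not necessarily how the on-board driver here is wired:

// Rough sketch only: assumes an Arduino driving a hypothetical A4988-style
// driver with its MS1/MS2/MS3 mode pins broken out; a cheapo on-board
// driver may not expose these pins at all.
#include <Arduino.h>

const int MS1_PIN  = 4;   // hypothetical wiring
const int MS2_PIN  = 5;
const int MS3_PIN  = 6;
const int STEP_PIN = 2;
const int DIR_PIN  = 3;

void setup() {
  pinMode(MS1_PIN, OUTPUT);
  pinMode(MS2_PIN, OUTPUT);
  pinMode(MS3_PIN, OUTPUT);
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);

  // A4988 mode table: MS1=HIGH, MS2=HIGH, MS3=LOW selects 1/8 microstepping.
  digitalWrite(MS1_PIN, HIGH);
  digitalWrite(MS2_PIN, HIGH);
  digitalWrite(MS3_PIN, LOW);

  digitalWrite(DIR_PIN, HIGH);   // arbitrary direction
}

void loop() {
  // One pulse on STEP advances 1/8 of a full step in this mode.
  digitalWrite(STEP_PIN, HIGH);
  delayMicroseconds(500);
  digitalWrite(STEP_PIN, LOW);
  delayMicroseconds(500);
}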
Fuck Israel
uziq
Member
+497|3708

SuperJail Warden wrote:

And before Uzique yells at me, I understand your job is to collab with a human on making something more appealing to human readers. My job is to collab with children/parents. AI will make you much more productive. Give it more of a chance.
i couldn’t care less about AI being used to help whip together teaching materials or handouts or whatever.

the fact young people today are significantly more illiterate and attention deficit than 10 years ago would suggest all this miracle technology isn’t producing great results, though.

and the university essays and examinations picture is even worse. i can’t see how they’re going to really deal with it without reverting to pencil and paper examinations. like it is that systemic. the rot runs through the whole system. academics waste an awful lot of time in an endless cat-and-mouse game, using AI plagiarism tools to try and catch signatures of AI cheating tools, and so on and so on.   

and i keep seeing videos of tech workers taking interviews with top companies while relying on an AI bot to listen and feed them answers. this ‘powerful’ technology so far seems to be utilised towards a whole bunch of ‘faking it until you make it’. is that really the best ethos in a pedagogical/educational context?

i just saw the other day on twitter an economics professor at dartmouth, educated at harvard, propose using AI to rewrite ‘tedious’ books written in ‘high formal language’. he wasn’t talking about précis of milton: he was talking about novels from before the 1970s. flowery language? strange locutions? let’s smooth all of that away to get a bland summary text. we need to maximise information content! who cares about reading or writing skills?

this is the kind of slop thinking that passes in ‘enlightened’, ‘educated’ circles even at prestigious institutions now. there’s a certain ideology of techbro trans-humanism that is missing the entire point of reading/literacy and learning more generally. any pain or friction associated with difficulty should be smoothed away by a benevolent chat bot. the results will not be good, let me tell you.

relying on an energy-guzzling chatbot to feed you with basic information you should have already acquired through basic elementary education doesn’t quite seem like the miracle civilisation-altering tech that the prophets in SV would have us believe.

again, in terms of quickly whipping up a summary or a lesson plan or a document template or a piece of boilerplate legal-contractual copy or 1000 lines of code … seems like a cool productivity tool. but it’s being used as a cognitive plug a lot of the times instead.

Last edited by uziq (2025-01-05 01:15:30)

SuperJail Warden
Gone Forever
+642|3975

uziq wrote:

the fact young people today are significantly more illiterate and attention deficit than 10 years ago would suggest all this miracle technology isn’t producing great results, though.
Back in 2001 there was an article about how students were becoming reliant on digital tools. One thing it (or maybe the follow-up) mentioned was how people were worried about Google and spell checkers making us worse off. They were talking about our generation. We turned out fine. The kids will be fine.

...

The AI is great assistive technology for disabled people anyway.

https://disabilityrightswa.org/publicat … -students/

I don't think you guys realize that there are many people who read/write at a level far below what we are doing right now. They can't just type up an email asking for help or a refund like we can. AI helps with that. If anything, we need to teach students how to use that stuff. Doubly so if we are going to teach them life skills to be able to function as adults.

...

I agree that colleges will need to switch to pen and paper/oral assessments. Both systems have their good and bad points. Autistic and anxiety prone individuals would struggle under such a system.

...

Colleges are in flux right now. Many are going out of business. That is a great thing. These academic institutions created some rigid systems and policies that actually made things harder for the students they were meant to serve. Many professional and graduate programs are now fully online, much to the benefit of the students. Not having to go onto a campus to learn like it is still the mid-'60s is a big improvement. It has made stacking professional degrees/knowledge much more manageable. It has expanded the knowledge base by making things less traditional.

I am going to do an Ed.D. soon. Most of the programs are cohort-based, meaning you need to follow a tight schedule and can get bounced from the program for falling a little behind. Why the hell is that necessary? Colleges want people to spend tens of thousands of dollars and then also want to be dicks about the experience. A lot of education is like that, and the MAGA movement is frankly not wrong about academia being up its own ass.

Last edited by SuperJail Warden (2025-01-05 01:54:49)

https://i.imgur.com/xsoGn9X.jpg
unnamednewbie13
Moderator
+2,053|7028|PNW

wheelchairs are an awesome mobility tool for the disabled too. but if the average person relied on them for getting around on a day-to-day basis, we'd probably see a lot more atrophied legs in the world.

i feel weird about the comparison of spellcheckers to full-on ai composition (see also, calculators and math classes), in terms of skill replacement/decay, effort, and resource consumption. when word processors began suggesting spelling and punctuation corrections, the overall quality of everyday messaging in a lot of environments probably went up, with relatively few lakes drained as a result. i imagine squiggly underlines corrected a few long-held misconceptions in their heyday (although not without a margin of error, or some unwanted and annoying scribbling).

if AI is placed on this same graph, it probably lands somewhere between diminishing returns and outright downgrade. between greenlighting suggestions made by ms word and having to rewrite or exhaustively reprompt ai output until it's passable but not great, the former presents less of a headache. even having ai autosort bullet points needs to withstand careful scrutiny. one typo or misunderstood fleck of input and suddenly *checks notes* napoleon with his Military Genius and Charisma and Tactical Brilliance is squaring off against darth vader in a cataclysmic showdown destined to shake the tides of reality and leave a spiritual mark on history for all future ge╟erations to come spark harvest city hall.

side note, the maga movement is literally, as a matter of principle, following people who've been lost up their own asses for years. trump's got a whole posse of auto-anal spelunkers lined up. he's passing out the headlamps as we speak. the trumpers are cherry-picking. it's sad and hilarious hearing some maga guy with his entire yard festooned with crass slogans and 45-47 kitsch grouch about anyone being up their own ass. "did i ever tell you the time i talked to elon musk on twitter? he liked one of my comments."

familyguyvomit.gif

Last edited by unnamednewbie13 (2025-01-05 03:17:22)

unnamednewbie13
Moderator
+2,053|7028|PNW

tl;dr, a teacher using ai to compose napoleonic lesson slides for a special needs class probably falls somewhere in the category of small fish to fry. i will say though that it reminds me of those videos of people yoinking sewer goo like thieves in the night to process into cooking oil.
uziq
Member
+497|3708

SuperJail Warden wrote:

Back in 2001 there was an article about how students were becoming reliant on digital tools. One thing it (or maybe the follow-up) mentioned was how people were worried about Google and spell checkers making us worse off. They were talking about our generation. We turned out fine. The kids will be fine.
while it's right and proper to be wary of any 'back in my day...'/'the young people of today ...' talk, i do think that particular analogy fails in several respects.

the web 2.0, search-engine version of the internet was qualitatively very different from the cloud/platform, applet-for-everything, AI-assisted, algorithmically determined web of the present. in terms of knowledge dissemination and learning, there were many structural similarities between early search engines, forums/bb-code communities, email and mailing lists, etc., and the forms of knowledge distribution that came before them. it was relatively easy to transition to 'online' research methods using the same 'research skills' as taught in the pre-digital epoch (i know, having taken a research masters and seen how the introductory research skills lectures were adapted for both paradigms). people who could independently direct their own learning, by exploring and finding their way around the library stacks, for instance, could adapt pretty easily to finding sites and the right info using search engines. ditto spellcheckers - these are just much more efficient dictionaries/thesauruses. how is that seriously disrupting people's learning practices or work? you still need to be literate and have a vocabulary before using them.

for instance, the transition to digital word processing or file systems really maps onto previous working and archiving practices. anyone who is familiar with a file cabinet can understand how to structure a directory of folders on a digital operating system like windows. this is where the entire concept of skeuomorphs comes from in design: there's a clear epistemic continuity between the floppy disk icon and the action of saving work that was legible to our generation, whether or not we actively used floppy drives. the youth today don't even know how to save files to a local machine, or good practices for organising their information. that's deleterious for things like research and being able to organise and think effectively for yourself. these youths are never getting out of the walled gardens of tech companies, or learning how to learn with the training wheels off. we had google, but we also had a plethora of alternatives that we knew how to use before/during google's early days; we were sure as hell never dependent on it.

SuperJail Warden wrote:

The AI is great assistive technology for disabled people anyway.

https://disabilityrightswa.org/publicat … -students/
well, quite. but this stuff isn't being shoved down our throats as 'accessibility' technology. the apple vision pro is probably great for many people in that regard too, but i wouldn't want every student in a class attending remotely via VR headset or whatever.

SuperJail Warden wrote:

I don't think you guys realize that there are many people who read/write at a level far below what we are doing right now. They can't just type up an email asking for help or a refund like we can. AI helps with that. If anything, we need to teach students how to use that stuff. Doubly so if we are going to teach them life skills to be able to function as adults.
so what's the better solution? to tackle the hard but necessary task of making them literate, or to give them a shortcut tool that will basically keep them as pavlovian dogs for the rest of their lives, tapping a button and waiting for a response? this doesn't just apply to schoolchildren and the fundamentals of reading and writing, either; as mentioned in my previous post, it reaches all the way up to the most prestigious tech companies. recruiting people who fake or demonstrate a perfunctory understanding of code using AI crutches, when really they haven't covered the groundwork. we are going to rapidly run into a situation where 'we forgot how to do it' is no longer a meme. (i know this is a widespread problem in software engineering and tech stacks anyway, with the rapid pace of change ... but i mean society-wide in all forms of knowledge production.)

SuperJail Warden wrote:

I agree that colleges will need to switch to pen and paper/oral assessments. Both systems have their good and bad points. Autistic and anxiety prone individuals would struggle under such a system.
okay, no disagreements again, except that i don't think we should be structuring higher education around the 1% of people with autism or whatever. overstressing that stuff is just as questionable as rejigging entire canons and curricula for people who may be 'triggered'.

SuperJail Warden wrote:

Colleges are in flux right now. Many are going out of business. That is a great thing. These academic institutions created some rigid systems and policies that actually made things harder for the students they were meant to serve. Many professional and graduate programs are now fully online, much to the benefit of the students. Not having to go onto a campus to learn like it is still the mid-'60s is a big improvement. It has made stacking professional degrees/knowledge much more manageable. It has expanded the knowledge base by making things less traditional.

I am going to do an Ed.D. soon. Most of the programs are cohort-based, meaning you need to follow a tight schedule and can get bounced from the program for falling a little behind. Why the hell is that necessary? Colleges want people to spend tens of thousands of dollars and then also want to be dicks about the experience. A lot of education is like that, and the MAGA movement is frankly not wrong about academia being up its own ass.
simply disagree here. the switch to purely online professional courses is great for the administrative caste of bureaucrats and educationalists who run the university sector. minimal operating costs, reduced accountability, and potentially infinite student numbers? wahoo! but "cui bono?" there's a fine line between 'adapting to modern technology' and 'degree mill that only exists in www space'.

as for being compelled to attend in-person classes ... that's because part of the value and point of a college degree is to make you collegial. most shit in terms of careers and professions doesn't happen in isolation; you need to meet people, network, develop interpersonal skills, and, yes, become habituated to being a 'college man' (as the old phrase went) - even if that does seem hoary and redundant and up its own ass. if you're really intending to go high up the ladder in the education world, you will have to be good at all the ass-sniffing and back-clapping that in-person contact and networking teaches you. this isn't particularly deep or new, anyway. college even in the 19th century was about cohabiting with a bunch of people who might become useful later on. it's entirely right that it should be a component of 'professional' postgraduate degrees. a person who gets an MBA online while living in south dakota with zero contacts probably isn't going far.

fair enough if you're not personally disposed to that, but the idea of paying for a graduate course and not getting to attend in-person classes or be part of a community on a campus or in a city seems depressing as fuck to me. for many people that is a hugely valued part of the experience. if you don't want to develop interpersonal skills or academic/professional networks, there's a whole range of other careers or consulting roles that pay bank. complaining about having to deal with in-person seminars and meetings while wanting to scale the education bureaucracy seems uh ... retarded actually. what do you envision your future career, after graduating, involving if not that?

i don't think the MAGA people have anything right about education in their worldview at all. at one end of the scale you seem to have a bunch of cynical billionaires who are trying to reform the ed sector to push their own ed-tech companies' shovelware down students'/teachers' throats - no matter how useful or beneficial it actually is. at the other end of the scale you find your garden variety ignoramus who thinks being unlettered is an a priori virtue. these are people who happily want to burn books according to political precept, ffs. throw in your lot with them at your own peril. that's just dumb.

Last edited by uziq (2025-01-05 08:17:47)

Dilbert_X
The X stands for
+1,815|6362|eXtreme to the maX
Yeah I don't know, umpteen things were supposed to have replaced humanity by now.
Expert systems without AI were doing a better job than doctors 30+ years ago, finite element analysis was supposed to have replaced engineers, Eliza was close-ish to being indistinguishable from a person.

AI will be a boost for good people who know how to use it, and a tool of coercion for everyone else - like most things through history.
It will enable everyone else to dumb themselves down even further, proper education being a mind-stretching 'learning how to think and learn' process more than memorising 'facts' which can be found written down.
So yeah, create your lesson plan by AI, your students will use AI to write their essays and you can set AI to mark them.
99.9% of people were never going to have a creative or useful thought in their lifetimes so why worry?

Where it will get sticky is when AI is given control over actual things.
https://assets.ign.com/thumbs/2009/02/20/t2_skynet_blu_022009-12.jpg
Fuck Israel
unnamednewbie13
Moderator
+2,053|7028|PNW

it wouldn't surprise me to learn that new terminator movies are being written at least in part by literal chatbots. that would make for some accurate robot dialog, at least.

ai is already sticky, dilbs. california, for instance, has a whole battery of ai and deepfake laws, and the one regulating ai decision-making in health insurance is currently making the rounds online. signed into law before luigi became a household name.

skynet is here, and it's nuking online discourse with fake news.
Dilbert_X
The X stands for
+1,815|6362|eXtreme to the maX

unnamednewbie13 wrote:

ai is already sticky, dilbs. california, for instance, has a whole battery of ai and deepfake laws, and the one regulating ai decision-making in health insurance is currently making the rounds online. signed into law before luigi became a household name.
Who is that going to stop exactly?
Fuck Israel
uziq
Member
+497|3708
elizabeth warren will stop it. someone go and find her at rumba class.
