ChatGPT & Generative AI tools

A place to discuss anything.
ShaunWhite
Posts: 10071
Joined: Sat Sep 03, 2016 3:42 am

napshnap wrote:
Fri Mar 07, 2025 1:52 pm
"Beep-bop-beep, shut up and keep mining, meat bags!"
This idea that it'll try to dominate and take over assumes it will have the same flaws as its creators. With no ego and no desire to 'progress', its conclusions might be quite benign. Or we get melted down for use as robot lube, so who knows eh.
ShaunWhite
Posts: 10071
Joined: Sat Sep 03, 2016 3:42 am

Bit off topic, but this seems like the right place.

In the words of GPT:
Upgrading to Office 2024 isn’t a simple install—it forces a full uninstall of Office 2019 first, meaning you lose settings, macros, templates, and UI customizations unless manually backed up. Export your ribbon/toolbars, macros (PERSONAL.XLSB), templates (Normal.dotm), and registry settings before uninstalling. Expect some manual fixes post-install. Microsoft isn’t making this hard by accident—they want perpetual license users to suffer just enough to consider switching to Microsoft 365. Also, if you have Visio 2019, it won’t work alongside Office 2024, forcing an upgrade to Visio 2024 or removal.
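For the file side of that, a minimal Python sketch along these lines would do the backup; the paths are the usual Windows defaults and the backup folder name is just illustrative, so adjust for your own profile and Office setup:

import shutil
from pathlib import Path

home = Path.home()
backup = home / "Office2019Backup"   # illustrative folder name
backup.mkdir(exist_ok=True)

# Usual default locations (assumes a standard Windows profile layout):
#   PERSONAL.XLSB  - Excel personal macro workbook
#   Normal.dotm    - Word default template
#   *.exportedUI   - ribbon/toolbar customisations, if you've exported any
candidates = [
    home / "AppData/Roaming/Microsoft/Excel/XLSTART/PERSONAL.XLSB",
    home / "AppData/Roaming/Microsoft/Templates/Normal.dotm",
]

for src in candidates:
    if src.exists():
        shutil.copy2(src, backup / src.name)
        print(f"backed up {src.name}")
    else:
        print(f"not found, skipped: {src}")

# Registry settings need a separate export, e.g. from a command prompt:
#   reg export "HKCU\Software\Microsoft\Office" office_hkcu_backup.reg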

Luckily I got an MSDN licence for Visio Pro 2024 for £12 (RRP £640!!). They rarely blacklist them, and at £12 they could do that 50 times and it would still be cheaper. My last key worked for six years before the uninstall.
sionascaig
Posts: 1541
Joined: Fri Nov 20, 2015 9:38 am

Starting to see some articles on "jailbreaking" various AIs so they're no longer constrained by the ethical or other restrictions/guidelines built in by the developers.

One article also hinted that a jailbroken bot could then be released back to the general public, following your own instructions/guidelines.

Scary world indeed.
Big Bad Barney
Posts: 325
Joined: Mon Feb 04, 2019 6:00 am
Location: Cairns Australia

sionascaig wrote:
Tue Mar 18, 2025 6:56 am
Starting to see some articles on "jailbreaking" various AIs so they're no longer constrained by the ethical or other restrictions/guidelines built in by the developers.

One article also hinted that a jailbroken bot could then be released back to the general public, following your own instructions/guidelines.

Scary world indeed.
Not sure they really work like that? You can try GPT4All to run your own models... they chug unless you've got a mammoth video card.

I've got a pretty beefy machine (top of the line two years ago, specced out sort of thing), and they still chug...
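For anyone who wants to try it, a rough sketch with the gpt4all Python bindings (pip install gpt4all) looks something like this; the model filename is just one example from their catalogue, and without a decent GPU it falls back to CPU and chugs as described:

# Minimal local-model sketch using the gpt4all Python bindings.
# Assumption: the model file below is one example from the GPT4All catalogue;
# it downloads (several GB) on first run.
from gpt4all import GPT4All

model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

with model.chat_session():
    reply = model.generate("Explain lay betting in one sentence.", max_tokens=100)
    print(reply)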
Euler
Posts: 25869
Joined: Wed Nov 10, 2010 1:39 pm
Location: Bet Angel HQ

Trying to trick the models and get around the restrictions seems to be an ongoing game.
