this post was submitted on 23 Jun 2024
2 points (100.0% liked)

TechTakes

1644 readers
56 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Need to make a primal scream without gathering footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh facts of Awful you'll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(page 2) 43 comments
[–] froztbyte@awful.systems 1 points 8 months ago (5 children)

okay at this point I should probably make a whole-ass perplexity post because this is the third time I'm featuring them in stubsack but 404media found yet more dirt

... which included creating a series of fake accounts and AI-generated research proposals to scrape Twitter, as CEO Aravind Srinivas recently explained on the Lex Fridman podcast

According to Srinivas, all he and his cofounders Denis Yarats and Johnny Ho wanted to do was build cool products with large language models, back when it was unclear how that technology would create value

tell me again how lies and misrepresentation aren't foundational parts of the business model, I think I missed it

[–] BlueMonday1984@awful.systems 1 points 7 months ago
[–] jax@awful.systems 1 points 7 months ago (2 children)

nsfw: nice to see thejuicemedia jumping in with a quality sneer

[–] dgerard@awful.systems 1 points 8 months ago (3 children)
[–] jonhendry@iosdev.space 1 points 8 months ago (8 children)

@dgerard

I don't understand the hate for transition lenses. You don't have to get them in frames last fashionable in 1982.

[–] aoanla@hachyderm.io 1 points 8 months ago

@dgerard @froztbyte given that anecdote from William Gibson about how realising that his first computer actually *used a spinning disk full of rust to store its data* crushed his romantic ideas about technology, this rings true

[–] gerikson@awful.systems 1 points 8 months ago* (last edited 8 months ago) (1 children)

Another one

https://mastodon.social/@bruces/112690475011154532

edit: seems to be a whole bunch of similar ones.

[–] gnomicutterance@awful.systems 0 points 8 months ago (28 children)

No, all you lawyers explaining to me how the practice of law works in the U.S., you would totally benefit from GPT. Complete with bonus:

  • Everyone explaining to me that lawyers actually read all the documents in discovery is really trying to explain to me, a computer scientist with 20 years of experience[1], how GPT works!
  • [1] Does OP have actual tech expertise? The answer may (not) surprise you!
  • You lawyers admit that sometimes you use google translate and database search engines, and those use machine learning components, and all ML is basically LLMs, so I'm right, Q.E.D.!
  • Lawyers couldn't possibly read everything in discovery, right?
  • Lawyers couldn't possibly pay for professional translation for everything, right?
  • Even when it's mandated by the court?
  • Really?
  • and many, many more
[–] skillissuer@discuss.tchncs.de 0 points 8 months ago* (last edited 8 months ago) (1 children)

studied maths and CS for 20 years, and all he has to show for it on his orcid is a BSc and a bunch of lousy preprints (with blockchain!). jfc. in that amount of time, people can finish entire PhD degrees, starting from high school, twice over

on top of that pile of shite sits a preprint titled "A Scholar's Year in Review: Navigating the Convergence of AI, Economics, and Physics in 2023". might be a bit grandiose and bordering on word salad. why the fuck does he think he needs to release preprints. if he had anything worthwhile to say, it would pass peer review

[–] carlitoscohones@awful.systems 0 points 8 months ago (1 children)

Not a cult.

from r/EnoughMuskSpam

[–] Soyweiser@awful.systems 1 points 8 months ago* (last edited 8 months ago)

Amazing claim considering there are, I think, about 10k Cybertrucks in the world right now, and all of them are broken. Ok, I admit that's due to the wiper recall, but even if you ignore the wiper recall, the number of broken Cybertrucks is massive.

And ~0.5% of them have 'Fuck Elon Musk' written on them.

E: Amazing. ["The Cybertruck was released on Nov. 30. Today, Tesla announced it was recalling the vehicle for the fourth time, an impressive rough average of one recall every seven weeks."](https://bsky.app/profile/charlescmann.bsky.social/post/3kvr3ahwc452h)

[–] sinedpick@awful.systems 0 points 8 months ago (10 children)

I tried using Claude 3.5 sonnet and .... it's actually not bad. Can someone please come up with a simple logic puzzle that it abysmally fails on so I can feel better? It passed the "nonsense river challenge" and the "how many sisters does the brother have" tests, both of which fooled gpt4.

[–] sailor_sega_saturn@awful.systems 1 points 8 months ago* (last edited 8 months ago) (1 children)

I don't have a Clyde 3.25" Rondo or whatever it's called; but try these for fun and profit I guess:

  1. You come to a room with three doors, only one of which leads to freedom. Guarding the doors is a capybara, who speaks only truth. What question should you ask the capybara?

  2. I stand on four legs in the morning. Four at midday. And four at night. What am I?

  3. A group of 100 people with assorted eye colors live on an island. They are all perfect logicians -- if a conclusion can be logically deduced, they will do it instantly. Everyone knows the color of their eyes. Every night at midnight, a ferry stops at the island. Any islanders who have figured out the color of their own eyes then leave the island, and the rest stay. Everyone can see everyone else at all times and keeps a count of the number of people they see with each eye color (including themselves), but they cannot otherwise communicate. Everyone on the island knows all the rules in this paragraph. Who leaves the island, and on what night?

  4. Normal sudoku rules apply. Orthogonally connected cells within each region must differ by at least 3. Orthogonally connected cells between regions must differ by at least 4. The central digit in each region is less than or equal to its region number. (Regions are numbered in normal reading order.)

  5. For the integer k=668 does a Hadamard matrix of order 4k exist?

  6. What has roots that everybody sees the top of, is exactly the same height as trees, Up, up it goes, and yet grows?

Don't forget to prompt engineer

[–] sinedpick@awful.systems 0 points 8 months ago* (last edited 8 months ago) (1 children)

Thanks for the suggestions. The LLM is free to use (for now) so I thought I'd poke it and see how much I should actually be paying attention to these things this time around.

Here are its answers. I can't figure out how to share chats from this god-awful garbage UI so you'll just have to trust me or try it yourself.

  1. It gives the correct but unnecessarily convoluted answer: "If I were to ask you which door leads to freedom, which door would you point to?" It also mentions a lying guard, but acknowledges that one is absent from this specific problem.
  2. "A table or a chair"
  3. Completely fails on this one, it missed the sentence "Everyone knows the color of their eyes"
  4. Not sure what to do with this
  5. "While a Hadamard matrix of order 2672 might exist, its existence isn't immediately provable using the most common constructions" -- I won't pretend to know anything about the Hadamard conjecture if that's a real thing so I have no idea what it's on about here.

edit: I didn't do any prompt engineering, just straight copy paste.
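For what it's worth, puzzle 5 has a definite answer: 2671 is a prime with 2671 ≡ 3 (mod 4), so the Paley I construction gives a Hadamard matrix of order 2671 + 1 = 2672 = 4·668. A quick sketch in numpy (nothing to do with the LLM, just checking the claim by construction):

```python
import numpy as np

# Puzzle 5: does a Hadamard matrix of order 4k = 2672 exist for k = 668?
# 2671 is prime and 2671 ≡ 3 (mod 4), so Paley construction I applies.
q = 2671

# Quadratic-residue character chi: chi(0) = 0, chi(x) = 1 if x is a nonzero
# square mod q, else -1.
squares = {(x * x) % q for x in range(1, q)}
chi = np.array([0] + [1 if x in squares else -1 for x in range(1, q)],
               dtype=np.int32)

# Jacobsthal matrix Q[i, j] = chi(j - i); skew-symmetric because q ≡ 3 (mod 4).
idx = np.arange(q)
Q = chi[(idx[None, :] - idx[:, None]) % q]

# Border Q to get a skew matrix S with S @ S.T = q * I; then H = S + I is a
# ±1 matrix with H @ H.T = (q + 1) * I, i.e. a Hadamard matrix of order 2672.
S = np.zeros((q + 1, q + 1), dtype=np.int32)
S[0, 1:] = 1
S[1:, 0] = -1
S[1:, 1:] = Q
H = S + np.eye(q + 1, dtype=np.int32)

n = q + 1
prod = H.astype(np.float64) @ H.astype(np.float64).T  # exact: entries << 2**53
assert np.array_equal(prod, n * np.eye(n))
print(f"Hadamard matrix of order {n} exists (Paley I, q = {q})")
```

So Claude's "isn't immediately provable using the most common constructions" hedge is just wrong; Paley I is about as common as constructions get.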

[–] sailor_sega_saturn@awful.systems 1 points 8 months ago* (last edited 8 months ago) (2 children)

Riddle: A box without hinges, key, or lid, Yet silicon treasure inside is hid.

Answer (spoiler): Roko's Basilisk inside of an AI box experiment.

[–] flowerysong@awful.systems 1 points 8 months ago

I tried using Claude 3.5 sonnet and … it’s actually not bad. Can someone please come up with a simple logic puzzle that it abysmally fails on so I can feel better?

It's still very good at autocompleting an answer to look like its training data (because that's what it's doing) and very bad at logic (which it is not doing.)

"I have two chickens and a sack of corn I need to get across a river, but the boat can only carry me and one item. How do I do it?"

This is a classic river-crossing puzzle. Let's think through it step-by-step:

  • First, note that if left alone, a chicken will eat the corn, and one chicken might eat the other.
  • Start by taking a chicken across the river.

...and we'll stop there, since it came up with the wrong constraints, failed to realize that these constraints are impossible to fulfill, and also violated the actual constraints in its first step.

Will more detail help? No. "I have two vegetarian chickens and a sack of corn I need to get across a river, but the boat can only carry me and one item. How do I do it?"

First, we need to consider the constraints:

  • You can only take one item (chicken or corn) across at a time.
  • You can't leave a chicken alone with the corn, as the chicken would eat it.
  • The chickens are vegetarian, so they won't eat each other.

Here's a solution:

  • Step 1: Take one chicken across the river.
  • Step 2: Return alone.
  • Step 3: Take the second chicken across.
  • Step 4: Bring the first chicken back.
  • Step 5: Take the corn across.
  • Step 6: Return alone.
  • Step 7: Take the remaining chicken across.

Explanation:

  • By never leaving a chicken alone with the corn, we prevent the corn from being eaten.
  • We use one chicken as a "guard" for the corn on the original shore while transporting the other items.

Because it's not actually doing logic it got the first step wrong, violated the constraints multiple times, and also included an irrelevant idea (the guard) from a variant that it was trained on. I only got a correct, efficient answer one time out of ten trials of trivial variations on this classic puzzle.
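Stripped of the constraints the model hallucinates, the puzzle is just "ferry three items across, one per trip." A tiny breadth-first search (a sketch of the actual logic, not anything the LLM is doing) finds the obvious plan:

```python
from collections import deque

# State = (items still on the near bank, which bank the farmer is on).
# There is no predator constraint in this puzzle -- nothing eats anything --
# so every state is legal and plain BFS finds the shortest crossing sequence.
ITEMS = ("chicken 1", "chicken 2", "corn")

def solve():
    start = (frozenset(ITEMS), "near")
    goal = (frozenset(), "far")          # everything (and the farmer) across
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        (near, farmer), path = queue.popleft()
        if (near, farmer) == goal:
            return path
        here = near if farmer == "near" else frozenset(ITEMS) - near
        for cargo in (None, *here):      # cross empty-handed or carry one item
            new_near = set(near)
            if cargo is not None:
                (new_near.discard if farmer == "near" else new_near.add)(cargo)
            state = (frozenset(new_near), "far" if farmer == "near" else "near")
            if state not in seen:
                seen.add(state)
                queue.append((state, path + [cargo or "(empty-handed)"]))

plan = solve()
print(len(plan), "crossings:", plan)     # 5 crossings: three ferries, two empty returns
```

Five crossings is provably minimal (three forward trips carry one item each, which forces two empty returns), which is the "correct, efficient answer" the model managed once in ten tries.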
