Perspectivist

joined 2 weeks ago
[–] Perspectivist@feddit.uk 1 points 11 minutes ago

I doubt it. They just think others do.

[–] Perspectivist@feddit.uk 12 points 2 hours ago (3 children)

Sure - it's just missing every single one of my friends.

[–] Perspectivist@feddit.uk 24 points 3 hours ago* (last edited 11 minutes ago) (6 children)

I wish I had Elon Musk money so I could buy this platform and turn it back to pictures only with the main focus on professional and hobbyist photographers - not pictures of food and selfies. It used to be one of the few social media platforms I actually liked.

[–] Perspectivist@feddit.uk 4 points 3 hours ago

Not really but we occasionally refer to it as "Big Black" for obvious reasons.

[–] Perspectivist@feddit.uk 1 points 5 hours ago

The best coffee I've ever drunk was from an AeroPress, but honestly, if you use freshly ground beans in a Moccamaster, the two are quite difficult to tell apart.

[–] Perspectivist@feddit.uk -3 points 7 hours ago

I don't wish to kill anyone and reading these comments makes me sick.

[–] Perspectivist@feddit.uk 9 points 20 hours ago (1 children)

It’s not to protect it from cracking - it’s to stop the leftover coffee from burning onto it, since I only rinse it after use.

[–] Perspectivist@feddit.uk 1 points 20 hours ago

I don't waste good coffee.

[–] Perspectivist@feddit.uk 8 points 20 hours ago (3 children)

It's intentional. Leaves an air gap between the pot and the hotplate.

[–] Perspectivist@feddit.uk 5 points 1 day ago

When I make coffee just for myself, I always measure out the same amount of water and this never happens. But my SO is slightly less autistic about it than I am and makes inconsistent amounts when brewing for the two of us - and I just can’t stand the thought of pouring even a drop of coffee down the drain. So, I spill it on the table and floor instead.

[–] Perspectivist@feddit.uk 3 points 1 day ago (2 children)

I live in a small granny cottage and "my desk" means the kitchen table 2.5 meters away. I technically could move it to my desk and it would still remain in the kitchen.

[–] Perspectivist@feddit.uk 1 points 1 day ago* (last edited 1 day ago)

The level of consciousness in something like a brain parasite or a slug is probably so dim that it barely feels like anything to be one. So even if you were reincarnated as one, you likely wouldn’t have much of a subjective experience of it. The only way to really experience a new life after reincarnation would be to come back as something with a complex enough mind to actually have a vivid sense of existence. Not that it matters much - it’s not like you’d remember any of your past lives anyway.

If reincarnation were real and I had to bet money on how it works, I’d put it down to something like the many‑worlds interpretation of quantum physics - where being “reborn as yourself” just means living out one of your alternate timelines in a parallel universe.

 

Now how am I supposed to get this to my desk without either spilling it everywhere or burning my lips trying to slurp it here? I've been drinking coffee for at least 25 years, and I still do this to myself at least three times a week.

140
submitted 1 day ago* (last edited 1 day ago) by Perspectivist@feddit.uk to c/til@lemmy.world
 

A kludge or kluge is a workaround or makeshift solution that is clumsy, inelegant, inefficient, difficult to extend, and hard to maintain. Its only benefit is that it rapidly solves an important problem using available resources.

 

I’m having a really odd issue with my e‑fatbike (Bafang M400 mid‑drive). When I’m on the two largest cassette cogs (lowest gears), the motor briefly cuts power once per crank revolution. It’s a clean on‑off “tick,” almost like the system thinks I stopped pedaling for a split second.

I first noticed this after switching from a 38T front chainring to a 30T. At that point it only happened on the largest cog, never on the others.

I figured it might be caused by the undersized chainring, so I put the original back in and swapped the original 1x10 drivetrain for a 1x11 and went from a 36T largest cog to a 51T. But no - the issue still persists. Now it happens on the largest two cogs. Whether I’m soft‑pedaling or pedaling hard against the brakes doesn’t seem to make any difference. It still “ticks” once per revolution.

I’m out of ideas at this point. Torque sensor, maybe? I have another identical bike with a 1x12 drivetrain and an 11–50T cassette, and it doesn’t do this, so I doubt it’s a compatibility issue. Must be something sensor‑related? With the assist turned off everything runs perfectly, so it’s not mechanical.

EDIT: Upon further inspection, the moment the power cuts out seems to sync perfectly with the wheel speed magnet passing the sensor on the chainstay, so I'm about 95% sure a faulty wheel speed sensor is the issue here. I have a spare part on order, so I can't confirm it yet - but unless there's a second update to this post, that solved it.

 

I see a huge amount of confusion around terminology in discussions about Artificial Intelligence, so here’s my quick attempt to clear some of it up.

Artificial Intelligence is the broadest possible category. It includes everything from the chess opponent on the Atari to hypothetical superintelligent systems piloting spaceships in sci-fi. Both are forms of artificial intelligence - but drastically different.

That chess engine is an example of narrow AI: it may even be superhuman at chess, but it can’t do anything else. In contrast, the sci-fi systems like HAL 9000, JARVIS, Ava, Mother, Samantha, Skynet, or GERTY are imagined as generally intelligent - that is, capable of performing a wide range of cognitive tasks across domains. This is called Artificial General Intelligence (AGI).

One common misconception I keep running into is the claim that Large Language Models (LLMs) like ChatGPT are “not AI” or “not intelligent.” That’s simply false. The issue here is mostly about mismatched expectations. LLMs are not generally intelligent - but they are a form of narrow AI. They’re trained to do one thing very well: generate natural-sounding text based on patterns in language. And they do that with remarkable fluency.

What they’re not designed to do is give factual answers. That it often seems like they do is a side effect - a reflection of how much factual information was present in their training data. But fundamentally, they’re not knowledge databases - they’re statistical pattern machines trained to continue a given prompt with plausible text.
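To make the "statistical pattern machine" point concrete, here's a deliberately tiny sketch - a bigram model, nothing like a real LLM in scale or architecture, but the same basic idea: it learns only which token tends to follow which, then continues a prompt by sampling likely next tokens. It has no notion of facts, only of patterns. The corpus and function names are my own invention for illustration.

```python
import random
from collections import defaultdict

# Toy training "corpus" - a real LLM would train on trillions of tokens.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each word.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def continue_prompt(word, length=5, seed=0):
    """Continue a one-word 'prompt' by repeatedly sampling a plausible next word."""
    rng = random.Random(seed)
    out = [word]
    for _ in range(length):
        followers = counts[out[-1]]
        if not followers:  # dead end: word never appeared mid-corpus
            break
        words, weights = zip(*followers.items())
        # Sample the next word in proportion to how often it followed the last one.
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

print(continue_prompt("the"))
```

The output is fluent-looking word salad drawn from the training data - which is exactly why fluency alone tells you nothing about factual accuracy.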

 

I was delivering an order for a customer and saw some guy messing with the bikes on a bike rack using a screwdriver. Then another guy showed up, so the first one stopped, slipped the screwdriver into his pocket, and started smoking a cigarette like nothing was going on. I was debating whether to report it or not - but then I noticed his jacket said "Russia" in big letters on the back, and that settled it for me.

That was only the second time in my life I’ve called the emergency number.
