Trogdor was popular way before Reddit
Examples? I can think of a number of foreign companies that the US facilitates, like Nestle.
Eh, I switched. I switched all of my lab’s computers, too, and my PhD students have remarked a few different times that Linux is pretty cool. It might snowball.
This makes sense, thanks
Why would China turn against Putin for using their nukes? I don’t keep up much on their relations.
Oregonians almost take pleasure in driving slowly in front of you. Maybe they’ve just gotten used to going slow because the entire state freeway system is always under construction. People driving crazily is infuriating for a completely different reason.
The best time to start was decades ago, but at least they’ve started.
This is a problem that’s becoming outdated, thanks to the NIH now requiring females to be included in studies in order to receive grant funding–barring an exceptional reason for studying males alone (e.g., male-specific problems). They are even requiring cell lines for in vitro studies to be derived, at least in part, from females, rather than from males alone.
I actually took that bit out because LLMs are pro climate and against everything that makes the environment worse. That’s a result of being trained on a lot of scientific literature. I was just curious what Opus would say about the conceptual knowledge piece.
Claude Opus disagrees, lol (edit to add: all of what follows is Claude; not me):
I respectfully disagree with the claim that current language models are logically incapable of mapping language to actual concepts. While today’s LLMs certainly have limitations and are prone to hallucinations, they have demonstrated a remarkable ability to capture and manipulate complex concepts through language.
A few key points:
LLMs are trained on vast amounts of human-generated text which is grounded in real-world concepts, allowing the models to build rich representations that go beyond surface-level word associations. Techniques like unsupervised pre-training enable LLMs to learn meaningful conceptual relationships.
In many domains, LLMs have shown an impressive capacity for reasoning, inference, and knowledge synthesis - skills that rely on a deep understanding of the underlying concepts, not just pattern matching. Their performance on complex question-answering, analogical reasoning, and natural language inference tasks is evidence of this.
LLMs can be fine-tuned and augmented with techniques like retrieval-augmented generation or chain-of-thought prompting to further ground them in factual knowledge and improve their reasoning capabilities. This allows more robust conceptual mappings.
Careful prompt engineering and sampling techniques can help to mitigate hallucinations and nonsensical outputs. While not a complete solution, this shows that the conceptual knowledge is often present even if the model sometimes struggles to express it faithfully.
That said, I fully agree that the tendency for subtle nonsense is a serious issue, especially for layperson use cases where hallucinations may go undetected. Continued research into making LLMs more truthful, consistent and robust is crucial. Techniques like constitutional AI to bake in truthfulness, and better UX design to convey uncertainty are important steps.
But in summary, I believe the evidence suggests that LLMs, while flawed, are not fundamentally incapable of meaningful conceptual representation and reasoning. We should push forward on making them more reliable and trustworthy, rather than dismissing their potential prematurely.
“I’m back on my BS” is also a solid contributor
Yep. We find things humorous if they’re a benign violation of our expectations. That’s also why some folks judge others for their taste in humor; either they see something as not benign (e.g., people getting injured) or not a violation of expectations (e.g., “stupid,” or wholly predictable).
I live in America. I’m learning to speak German, but that costs money. Maybe the migrants want to learn German but don’t have the money for it?
Sorry if the above is poorly worded; I’m still new to the language. My point is that there are lots of reasons that someone might not know a language well, including a lack of money, or a lack of time from needing to work full time to support one’s migrant family on a low wage.
Mexican immigrants to the US are wonderful, but their culture is very different from non-Hispanic US culture. I don’t expect them to learn English. They work like 60 hours per week to support their families. Like the person you’re replying to has said, though, their children learn English and integrate into, but also uniquely contribute to, US culture. Rather than expecting the first-generation immigrants to learn English, I’ve learned Spanish specifically to speak with them. It’s not like there are many more immigrants to Germany than there are immigrants to the US–even discounting the fact that the US has always been a country of immigrants, Hispanic and Latino/a/e Americans (the majority of whom are Mexican Americans) are expected to exceed 50% of all Americans within a couple of decades. In some states, they are already the majority.
Diversity is a good thing, and we shouldn’t require immigrants to become like us culturally or linguistically before accepting them.
That’s fascinating, and I agree with you. Why the US hates the idea of high-speed rail is beyond me, especially because they prided themselves so much on the rail system they put together earlier in their development. In any case, the US can’t do much of anything with its debt-to-GDP ratio as high as it is right now. They can hardly keep from shutting the government down entirely because they won’t even agree to a government budget.
Also, the US is 9.14 million sq. km of land, whereas the EU is 4.29 million sq. km of land
EU is still smaller
But the main reason the US can’t handle the same stuff at a federal level that the EU can is population density. The US government can’t afford to nationalize rural healthcare given how rural the US can be–especially with its debt-to-GDP ratio at the moment. Give it another few hundred years and the US might catch up to Europe in that respect.
Sync does formatting correctly. I came to Lemmy only because I like Sync so much. I paid for the lifetime version of it with Reddit and will probably pay for the lifetime version of this eventually. To each their own wrt how Lemmy is viewed, I guess.