Understanding Concepts

Hey guys, I'm teaching myself JS right now and I wanna hear your thoughts and explanations: should I get help from ChatGPT to explain concepts to me (without showing code examples), or should I skip that and use other material like the official docs, MDN, etc.?
6 Replies
vince · 4mo ago
Always cross-reference with the docs. But I don't see why using ChatGPT to help you learn is bad, just use it as a supplement. This server is quite biased against AI, but I think as long as you use it right it can be a nice tool, just don't over-rely on it.
ἔρως · 4mo ago
we're "biased" because ai is known for making shit up, while trying it's best to convince you it is correct
Jochem · 4mo ago
The main issue to beware of with AI is that it's going to lie occasionally, and as a beginner you may not be able to spot it when it does. Use AI to explain a concept to you, for sure, but don't rely entirely on that explanation. Use that new knowledge to look up primary sources or test things out, and keep in the back of your head that what it said might not have been accurate or even relevant. It has quite literally made up CSS properties for me in the past.
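(A tiny example of the "test things out" step: in a browser console you can ask the browser itself whether a CSS property/value pair actually exists, using the standard CSS.supports() API; the made-up property name below is just for illustration.)

```js
// Verify a CSS suggestion instead of taking the AI's word for it:
console.log(CSS.supports("display", "grid"));        // true, real property and value
console.log(CSS.supports("text-magic", "sparkly"));  // false, made-up property
```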
big saf 🍉 · 4mo ago
I would start by just doing it. Pick a course and practice. AI can be used, but it can be inaccurate and make you dependent on it.
13eck · 4mo ago
Short version: asking Gippity to explain something is OK; asking it to write code is not. "What's the DRY principle?" or "What's the difference between a JavaScript Map and an object?" are fine, but "write code to…" is not.

Why? There's loads of good advice on the internet that GPT (and other LLMs) plagiarized, so they can explain principles and practices well enough. But because they lie so convincingly, and because as a new dev you don't know what you don't know, you might assume the model knows better than you and think something is right when it's wrong.

Also, there's so much bad code out there. Most of the code LLMs are trained on is not production-ready. Think of every YouTuber who writes a small proof-of-concept to showcase "how does X work?" but doesn't put in enough effort for the snippet to be usable in production. GPT et al. don't know that it's only partially correct code; they just tell you it's right.

As an example, a few weeks ago I asked GPT to help me with an XML parser for Node.js. It told me it was super easy, just use the DOMParser API. Which is a browser API and not implemented in Node.js. But it was so sure it would work that even when I told it that API was browser-only, it still tried to tell me it was what I wanted.

And that's another problem with JS specifically: there are several flavours of JS (browser, Node.js, Bun, Deno, Cloudflare Workers, etc.), and LLMs will happily tell you to use an API that's not available on the platform you're using.
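(To make that last point concrete, here's a minimal sketch you can paste into a browser console and into a Node.js REPL; the feature checks use only standard globals, and the closing suggestion to pick a userland XML library for Node.js deliberately doesn't name a specific package.)

```js
// Rough feature checks showing why "just use DOMParser" depends on the platform:
console.log(typeof DOMParser); // "function" in a browser, "undefined" in Node.js
console.log(typeof process);   // "object" in Node.js, "undefined" in a plain browser page

if (typeof DOMParser !== "undefined") {
  // Browser: DOMParser is a built-in Web API, so this works.
  const doc = new DOMParser().parseFromString("<note><to>World</to></note>", "application/xml");
  console.log(doc.querySelector("to").textContent); // "World"
} else {
  // Node.js: no built-in DOMParser. You'd reach for a third-party XML parser instead,
  // and check that library's docs rather than trusting an LLM's description of its API.
  console.log("No DOMParser here; pick an XML library for Node.js");
}
```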
NugByte (OP) · 4mo ago
I love this community so much ❤️ And I'm talking about explanations only, not even showing a single line of code for the concepts I want to learn and memorize.
