How do computers read code?
When you first learned to write code, you probably realized that computers don’t really have any common sense. You need to tell a computer exactly what you want. But do you know about all the work the computer does to understand what you mean?
Twitter: https://twitter.com/frameofessence
Facebook: https://www.facebook.com/frameofessence
YouTube: https://www.youtube.com/user/frameofessence
Video links:
Crash Course Computer Science:
Building the Bits and Qubits
Tools used:
gdb
gcc
Monospaced font:
Menlo-Regular
Images and other visuals:
The IDE in the intro:
Eclipse
Python scripting:
IDLE
Source code distribution example:
Apache httpd on GitHub
Executable distribution examples:
Audacity
VLC media player
Blender
Punch cards:
https://en.wikipedia.org/wiki/File:FortranCardPROJ039.agr.jpg
https://commons.wikimedia.org/wiki/File:Punched_card_program_deck.agr.jpg
Early computers:
https://en.wikipedia.org/wiki/File:BRL61-IBM_702.jpg
https://en.wikipedia.org/wiki/File:IBM_701console.jpg
Complex history of computer languages:
https://en.wikipedia.org/wiki/Generational_list_of_programming_languages
Montage:
Sublime Text
IntelliJ IDEA
https://www.haskell.org/
IntelliJ IDEA again…
Print “Hello, world!” command:
Python shell
Music:
YouTube audio library:
Sunflower
Incompetech:
Call to Adventure
If I Had a Chicken
Premium Beat:
Cutting Edge Technology
Second Time Around
Swoosh 1 sound effect came from here:
http://soundbible.com/682-Swoosh-1.html
…and is under this license:
https://creativecommons.org/licenses/sampling+/1.0/
Amazing and comprehensive!
Usually I spend a week explaining all of this to my students (with more details, of course), but this is so helpful!
*laughs in already just programming in assembly*
Love it…
Please make more videos 🙁
This video is pure knowledge!
Any computer class worth its salt would go through compilers
Nicely explained, a like for you 👏👏👏
Only 5 mins in, then I started reading the comments
Frame of Essence’s “Frame of Essence” only happens once a year. Hello from 2019, my friend.
So the thing that you did in that extra video to make the video making process go smoothly is make 1 video a year instead of 2?
Awesome video
I can’t fucking tell if this is Java or C#
"All you did was Python scripting!?" lmao
This is the best, easiest, and most fun software intro video I’ve ever watched! Bar none!!!
dude, I feel kind of worthless lol
I can’t do C yet, or anything like that (though I can sort of follow it when having to tackle the firmware on the 3D printers…)
but I do have the advantage that I learned about logic gates and binary arithmetic on discrete components before I started learning assembly for PIC chips,
so the whole system makes perfect sense now.
Division and subtraction are hard operations which involve a few more steps (except for halving, which is a simple shift right). Addition and multiplication are easy, doubling especially so (shift left).
All the compiler does is turn our language into hexadecimal units, which are then fed into the chip as binary. All instructions, data, and variables are binary; just a bunch of switches in parallel, basically.
And because I work in assembly, and take note of the hex dumps, I can code in binary…
I draw the line at 8 bits though!
I gotta learn C; for most purposes it’s probably easier. It’s pretty easy to get lost with gotos and skips…
but when the program is bordering on the limit of ROM, assembly takes longer but is usually far more efficient.
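The shift trick described in the comment above is easy to see in any language; a minimal sketch in Python (on a real chip each shift is a single instruction):

```python
# Doubling is a left shift; halving is a right shift.
x = 10
doubled = x << 1   # same as x * 2  -> 20
halved = x >> 1    # same as x // 2 -> 5
print(doubled, halved)

# Multiplication by a constant can be built from shifts and adds,
# e.g. x * 10 == (x << 3) + (x << 1), i.e. 8x + 2x.
assert x * 10 == (x << 3) + (x << 1)
```

This is why doubling and halving are cheap while general division needs extra steps: there is no single shift that divides by an arbitrary number.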
What if my code identifies as non-binary?
Great video!
*Lisp* is real pain
👏👏👏
🤣🤣🤣🤣🤣
BTW, you can make a compiler using raw assembly, or (in my case) a switch bus. I made a home-brew computer and entered programs using a 0-9, A-D, * (E), and # (F) keypad, with a switch to change modes (memAddr and memSet).
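Entering bytes on a 0-F keypad like the one described works out to two keypresses per byte: high nibble first, then low nibble. A small sketch of that combination step (the function name here is made up for illustration):

```python
# Two hex keypresses make one byte: high nibble shifted up, low nibble OR'd in.
def byte_from_keys(hi_key, lo_key):
    return (int(hi_key, 16) << 4) | int(lo_key, 16)

print(hex(byte_from_keys("A", "5")))  # 0xa5
print(hex(byte_from_keys("0", "F")))  # 0xf
```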
really good video! explained perfectly
Wonderful!
My Head Hurts 😭😂
No programmer has been able to make me understand HOW THE ABSTRACTION LAYER REALLY WORKS. How does the compiler translate? I mean, how does the compiler know that 3 is 00000011? I guess there is a table that tells the compiler what even the simplest instruction really is. When you put 3 in your source code, is 00000011 really allocated somewhere for the compiler to read? Or is it the ASCII code? I mean, if I type "3" on the keyboard, in C or Python, is it really stored as 00000011? Because no matter how low-level a program is, at some point the code is converted into electrical signals that open or close logic gates. Even then, 0 and 1 are still just representations of off/on, closed/open, etc. After several Google searches, I read about microcode. Is that where the "magic" actually happens? Or maybe the instruction set architecture? And again… how is text (albeit binary) like 00000001 or 00001011 understood by the processor? I’m not a native English speaker; I hope you understand what I mean, and I’d appreciate it if someone could clear up my doubts xD
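Part of the question above has a concrete answer: the character "3" in a source file and the number 3 the compiler emits are different things. The file stores the ASCII byte 0x33; the compiler parses that text into the binary value 00000011. A quick Python sketch of the distinction:

```python
source_text = "3"             # what you typed: stored in the file as ASCII byte 0x33
print(ord(source_text))       # 51 -> the byte actually sitting in the source file
print(bin(ord(source_text)))  # 0b110011

value = int(source_text)      # the compiler/interpreter parses the text...
print(bin(value))             # 0b11 -> ...into the machine value 00000011
```

So "3" on disk is 00110011, and only after parsing does it become 00000011 in memory.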
damn, how did you know the exact question in my mind right at the start?
I haven’t gone through any classes.
I’ve only watched youtube videos.
And halfway finished a Windows 8.1 app: Solo Learn C++
And I can’t remember anything about it.
🙂
the fact that you did not do x++ and did x = x + 1 hurts my soul
Soon AI will be creating code, leaving all the programmers jobless.
You should have like 100 million subscribers
A problem has been detected and Windows has been shut down to prevent damage to your computer.
111101000100010110001011
Is it wrong that I still write in a programming language that’s 35 years old? 😀
Try writing an assembly program for an AVR microcontroller and you’ll see what a status register, carry flag, etc. are. You’ll learn what timers, the stack, and macros are. Although… never mind, just keep going -> System.out.println("Hello world") and your ideas will become the engine of progress.
P.S. But someone has to write the compiler for you…
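The carry flag mentioned above is just the ninth bit that doesn’t fit in an 8-bit register. A minimal sketch of the idea (plain Python, not AVR-specific):

```python
# 8-bit addition: the result wraps to 8 bits, and the carry flag records
# whether the true sum overflowed past 255.
def add8(a, b):
    total = a + b
    return total & 0xFF, total > 0xFF   # (result, carry flag)

print(add8(200, 100))  # (44, True)  -- 300 wraps to 44 with carry set
print(add8(1, 2))      # (3, False)
```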
this video made me realize that I’m nothing
Which assembly was in the video? 8086?
Python is better than C any day
But how does the compiler know how to convert the assembly instructions to hexadecimal?
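Short answer to the question above: the assembler carries a built-in table mapping each mnemonic to its opcode bits, and that table comes from the CPU vendor’s instruction set manual. A toy sketch (the opcodes here are made up, not from any real ISA):

```python
# Hypothetical opcode table -- a real one comes from the CPU's ISA manual.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(lines):
    """Turn 'MNEMONIC operand' lines into a flat list of byte values."""
    out = []
    for line in lines:
        parts = line.split()
        out.append(OPCODES[parts[0]])          # look the mnemonic up
        out.extend(int(p) for p in parts[1:])  # operands pass through as bytes
    return out

program = ["LOAD 10", "ADD 32", "STORE 10", "HALT"]
print([hex(b) for b in assemble(program)])
# ['0x1', '0xa', '0x2', '0x20', '0x3', '0xa', '0xff']
```

A real assembler also handles labels, addressing modes, and variable-length encodings, but the core is still this lookup.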
this is FUNNY!
Even when I write "Hello world" there are 20 errors
no one
my brain: where am I..? why am I???? who am I!!????
Well I liked this channel
Why 0 and 1? Why not 0, 1, and 2 for quantum computers? 2 can be 0 and 1 at the same time :0
My day’s okay, and how is your day going?
I opened a 32-bit .asm file and the first 10 characters were ╩P@_AS÷∩¿@
Imagine punching all the holes for old computer programs and then realizing you forgot a semicolon….
The CPU uses a fetch-execute cycle: it fetches an instruction from memory, decodes that instruction, and executes it
Source: Tom Scott, god of the universe
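The fetch-decode-execute loop from the comment above can be sketched in a few lines. This is a made-up two-instruction machine for illustration, not any real CPU:

```python
# Tiny made-up machine: opcode 1 = add the next byte to the accumulator,
# opcode 0 = halt. Program: ADD 5, ADD 7, HALT.
memory = [1, 5, 1, 7, 0]
pc, acc = 0, 0   # program counter and accumulator

while True:
    opcode = memory[pc]          # fetch
    if opcode == 0:              # decode...
        break
    operand = memory[pc + 1]
    acc += operand               # ...execute
    pc += 2                      # advance to the next instruction

print(acc)  # 12
```

Real CPUs do the same loop in hardware, with decoding spread across circuits (and, on many chips, microcode) instead of an `if`.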
I’ve always been curious about that fact.
When you write code, how does the computer know what if statements and so on mean? Now I know, cool
I recently thought about something that this video kind of touched upon. Programs such as compilers were once written entirely in machine code or assembly, and the transistors in processors are built by machines that themselves run on processors. So how awful would it be, and how far would we be set back, if we somehow (in a somewhat plausible scenario) lost the code and programs that allow us to compile programs? Even if we were to live on another world, we would likely have to take all this knowledge and these programs with us, and possibly even a hardware manufacturing robot, just to make the hardware for the factories that make us hardware, for the civilisation to be reasonably advanced compared to Earth. Kinda scary how easily we could be set back, given a nuclear or digital Armageddon.
0:01
I never got to programming class tho