Meanwhile over in the mechanical engineering department, someone is complaining that they have to learn physics when they just wanted to build cool cars.
...then don't study computer science. I study CS and it's annoying when someone in a more math/logic oriented course is like "If I get a job at a tech company I won't need this". All that IS computer science, if you just wanna code, learn to code.
The problem is that a lot of people who want to learn to code, and are conditioned to desire the college route of education, don't actually know that there's a difference, and that you can be completely self-taught in the field without ever setting foot in a university.
I always wanted to believe this, but, at least in my country, not even a specialized high school degree is enough to get me anywhere for months, it's crazy.
Maybe you could even make it without formal education, but everyone's always looking for those sweet 3+ years of experience in the field (ಥ﹏ಥ)
We're not closing schools just because we have libraries and the internet; having (good) teachers helps you learn faster and pushes you further. There are some good programming schools that can make the process more efficient for you. I think the main problem is rather the insane cost of higher education in the USA, which creates anxiety about whether you'll be able to repay it with the future it may open for you. It is sad.
It's harder to break into but I make 150k and barely graduated high school. Software engineering is largely a field that doesn't care about degrees but about ability. It's harder these days to break into the field than it was 10 years ago when I did but it's absolutely still possible
I've never been to college and my job title today is Software Architect, I've been doing this for nearly 20 years.
It was extremely hard at first to get a job because everyone wanted a BA, but that was also 20 years ago. Once I had some experience and could clearly demonstrate my capabilities they were more open to hiring me. The thing a degree shows is that you have some level of experience and commitment, but the reality is a BA in CompSci doesn't actually prepare you for the reality of 99% of software development.
I think most companies these days have come to realize this. Unless you're trying to apply to one of the FANG corps (or whatever the acronym is now) you'll be just fine if you have a decent portfolio and can demonstrate an understanding of the fundamentals.
I used to work at a small tech company (5-10 employees) and when we hired for entry level coders we’d receive hundreds of applications. Most of them would be grads from bootcamps, some with undergraduate degrees and some without. My boss would just throw out any that didn’t have a bs in something, but preferred a stem degree. He knew they didn’t need a degree, he knew you didn’t need actual coding experience, it was just a quick (maybe illegal) way to make that list of applications more manageable.
Edit: as other people have said - after your first job you are basically “in” and are a very desirable candidate. Your education matters much less after your first job.
I would have done CS if every math class at my school didn't have 500 people in it. Even college algebra. They basically made everything a weed-out class
I do think many of the CS concepts are pretty cool :)
Well, what I've felt working at a tech company is that there are instances where we run into specific problems that require devising an algorithm, and most of my non-computer-science peers fail to understand why!!
The point of these lectures is mostly not to teach how to work with Turing machines, it is to understand the theoretical limits of computers. The Turing machine is just a simple to describe and well-studied tool used to explore that.
For example: are there things that cannot be computed on a computer, no matter how long it computes? What if the computer were able to make guesses along the way, could it compute more? As this comic suggests, no; it would only be a lot faster.
Arguably, many programmers can do their job even without knowing any of that. But it certainly helps with seeing the big picture.
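To make "the theoretical limits of computers" concrete: the classic proof that halting is undecidable fits on one screen. Here's a hedged Python sketch of the diagonalization argument (the `halts` parameter is hypothetical; the whole point is that no correct implementation of it can exist):

```python
def make_paradox(halts):
    """Given a claimed halting decider halts(f, x) -> bool, build a
    function that defeats it by doing the opposite of its prediction."""
    def paradox(f):
        if halts(f, f):
            # halts claims f(f) halts, so loop forever instead
            while True:
                pass
        # halts claims f(f) loops, so halt immediately instead
        return "done"
    return paradox
```

Feeding `paradox` to itself, `halts(paradox, paradox)` can't be correct either way: whatever it answers, `paradox(paradox)` does the opposite. So no total, correct `halts` can be written.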
Arguably, a much more important thing for the students to learn is the limits of humans. The limits of the computer will never be a problem for 99% of these students or they'll just learn on the job the types of problems they're good at solving and the ones that aren't.
I didn't go to university because I wanted to learn useful stuff, but because I'm curiosity-driven. There is so much cool stuff out there, and it's very cool to learn it. That's the point of university: it prepares you for a scientific career where the ultimate goal is knowledge, not profit maximization (super idealistically).
Talking about Turing machines, it's such a fun concept. People use it to build computers out of everything; like, really, it has become a sport by this point. When the last Zelda was released, the first question for many was whether they could build a computer inside it.
Does it serve a practical purpose? At the end of the day, 99% of the time the answer will be no: we have computing machines built from transistors that are the fastest we know of, so let's just use those.
But 1% of the time people recognize something useful... hey, we've now found out that in principle one can build computers from quantum particles... we found an algorithm that could beat classical computers at a certain task... we found a way to actually do this in reality, but it's more of a proof of concept (factoring 15 = 5×3)... and so on.
About 15 years on, I'm still so happy I got good coursework marks for the route-finding equivalent of a bogosort: pick a bunch of random routes and keep the fastest. Sure, the guy who set up a neural net to figure it out did well, but mine didn't take days of training, and still did about as well in the same sort of execution time.
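That random-sampling approach can be sketched in a few lines of Python. This is my own illustration (the function names and the 2D-waypoint setup are made up, not the original coursework): shuffle the visiting order many times and keep the shortest route seen.

```python
import math
import random

def route_length(points, order):
    """Total Euclidean length of visiting points in the given order."""
    return sum(math.dist(points[a], points[b])
               for a, b in zip(order, order[1:]))

def random_route_search(points, tries=1000, seed=0):
    """Bogosort-style search: sample random visiting orders, keep the shortest."""
    rng = random.Random(seed)
    best_order, best_len = None, float("inf")
    for _ in range(tries):
        order = list(range(len(points)))
        rng.shuffle(order)
        length = route_length(points, order)
        if length < best_len:
            best_order, best_len = order, length
    return best_order, best_len
```

With enough tries on a small problem this lands on (or near) the optimum, which is exactly why it held its own against fancier methods on coursework-sized inputs.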
But you can make games that much more interesting if your algorithms are on point.
Otherwise it's all "well, I don't know why it generated a map that's insane", or "well, the AI has this weird bug but I don't understand where it's coming from".
I’m grateful to this strip because reading it caused me to learn the correct spelling of “abstruse”. I’ve never heard anyone say the word, and for some reason I had always read it as “abtruse”, without the first S.
I did games technology at university. We had a module that was just playing board games and eventually making one. Also did an unreal engine module that ended with making a game and a cinematic.
I never really understood the point of lambda calculus. Why have an anonymous function? I thought it was good practice to meticulously segment code into functions and subroutines and call them as needed, rather than have some pseudo-function embedded somewhere.
I think you're confusing lambdas with lambda calculus. Lambda calculus is more than just anonymous functions.
To put it extremely simply, let's just say functional programming (a style of programming modeled on lambda calculus) is code with functions as data and without shared mutable state (or side effects).
The first one increases expressiveness tremendously, the second one increases safety and optimization. Of course, you don't need to write anonymous functions in a functional language if you don't want to.
As for why those "pseudo-functions" are useful, you're probably thinking of closures, which capture state from the context they are defined in. That is pretty useful. But it's not the whole reason lambda calculus exists.
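For what it's worth, a closure capturing state from its enclosing scope looks like this in Python (a minimal made-up example):

```python
def make_counter(start=0):
    # 'count' lives in this enclosing scope; the inner
    # function closes over it and mutates it across calls.
    count = start
    def increment():
        nonlocal count
        count += 1
        return count
    return increment
```

Each call to `make_counter` produces an independent counter, because each closure captures its own `count`.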
See the other comments about lambdas vs. lambda calculus, but lambdas are supposed to be for incredibly simple tasks that don't need a full function definition, things that could be done in a line or two, like simple comparisons or calling another function. This is most useful for abstractions like list filtering, mapping, folding/reducing, etc. where you usually don't need a very advanced function call.
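The kinds of one-liners described above look like this in Python (the word list is just made-up example data):

```python
from functools import reduce

words = ["pear", "fig", "banana", "kiwi"]

by_length = sorted(words, key=lambda w: len(w))              # sort key
short = list(filter(lambda w: len(w) <= 4, words))           # filtering
shouted = list(map(lambda w: w.upper(), words))              # mapping
total_chars = reduce(lambda acc, w: acc + len(w), words, 0)  # folding/reducing
```

Each lambda is a single expression passed straight to a higher-order function; none of them would gain anything from a full named definition.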
I was always taught in classes that if your lambda needs more than just the return statement, it should probably be its own function.
"Introduction to the Theory of Computation" by Michael Sipser, a book commonly referred to as simply "Sipser". My ToC course in uni was based around that book and while I didn't read the whole thing I enjoyed it a ton.
I read it cover-to-cover like fifteen years ago. I've lost most of that knowledge since I haven't touched it in so long, but I remember I really enjoyed it.
Hm, I wonder if I could make these students more miserable by introducing a CPU that permits static operation, then clocking that with a true random number generator?
So now it has output that is deterministic from the standpoint of the CPU but nondeterministic to an outside observer. Come to think of it, it probably wouldn't affect the big-O analysis, but it would be funny.
Thanks for the fun rabbit hole. They can't really solve the halting problem, though: you can make an oracle that solves the halting problem for a Turing machine, but not for itself. Then of course you can make another oracle machine that solves the halting problem for that oracle machine, and so on and so forth, but an oracle machine can never solve its own halting problem.