If you’re doing C++ then C++ Weekly by Jason Turner is an awesome must-watch.
Yoko, Shinobu, and, um… 🤔
Am Yisrael Chai (the people of Israel live), Slava Ukraini (Glory to Ukraine) 🇺🇦 ❤️ 🇮🇱
My bad, I’ll move there then
Pascal is so incredibly good and simple that I was writing programs in it (sort of; half the time it was gibberish) when I was 7 years old. What helped a lot was that, at the time, Turbo Pascal came with lots of cool examples and tutorials, so you could just play around with code snippets until you figured things out on your own. Those who have witnessed how programming can be taught to 7–10 year olds today using JS or Python might relate: Pascal was just that simple and clean.
Delphi was also amazing: it offered the same simplicity as Visual Basic for making GUI apps while featuring a much better and more rigorous language.
The court documents state that Brody’s employment was terminated after he violated company policies by connecting a USB drive containing pornography to company computers.
Okay, I mean, there are many types of stupid but that’s a completely new one.
The OS won’t matter much in the beginning, though it helps that you’re already using Linux as you likely already have Python and GCC installed.
I don’t think you need a better PC than what you already have if the only goal is to learn programming, so I’d spend that money on something else.
I’d suggest you go through Harvard’s CS50 if you’ve never been exposed to computer science before: https://www.harvardonline.harvard.edu/course/cs50-introduction-computer-science . You can audit it for free, you don’t really need to pay for the certificate (which IMO doesn’t have much value at that level anyway).
Also, try to get into a computer science degree if you want to do that as a career, bootcamps and MOOCs are nice additions but will never replace a real degree.
Roughly one year. No GPU code for that project, though: the target library is CPU-only anyway, so it’s not really comparable to PyTorch (and PyTorch is more than just the autodiff), but there was lots of SIMD vectorization. You could train a neural network on CPU with it if you wanted, and the expression template stuff I talked about would be somewhat equivalent to PyTorch’s operator fusion, but the target use is quant finance code.
Automatic differentiation in C++17. I had to do it from scratch because they weren’t open to third-party libraries, and it had to integrate with many symbolic calculations that are also done in the library (so even with a third-party library you’d still have to rewrite parts of it to deal with the symbolic stuff). It succeeded, and performance was on par with what people in the industry would expect: all derivatives at once in around 3 times the evaluation time, sometimes much less if the calculation code has no dynamic parts and is differentiated entirely at compile time.
It was pretty cool because it was a fun opportunity to really abuse template metaprogramming, especially expression templates (you’re essentially building expression trees at compile time), along with compile-time lazy evaluation, static polymorphism, lots of SFINAE dark magic, and custom memory allocators.
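Expression templates are a C++ technique, but the core idea carries over: here’s a toy Python sketch (all names made up for illustration, nothing from the actual library) of operators building a lazy expression tree that only computes when asked:

```python
# Toy sketch of the expression-tree / lazy-evaluation idea behind
# expression templates. The real thing builds these trees in C++ at
# compile time; this just shows the deferred-evaluation mechanism.

class Expr:
    def __add__(self, other):
        return Add(self, other)
    def __mul__(self, other):
        return Mul(self, other)

class Var(Expr):
    def __init__(self, value):
        self.value = value
    def eval(self):
        return self.value

class Add(Expr):
    def __init__(self, lhs, rhs):
        self.lhs, self.rhs = lhs, rhs
    def eval(self):  # nothing is computed until eval() is called
        return self.lhs.eval() + self.rhs.eval()

class Mul(Expr):
    def __init__(self, lhs, rhs):
        self.lhs, self.rhs = lhs, rhs
    def eval(self):
        return self.lhs.eval() * self.rhs.eval()

x, y = Var(2.0), Var(3.0)
tree = x * y + x       # builds Add(Mul(x, y), x); no arithmetic yet
print(tree.eval())     # 2*3 + 2 = 8.0
```

In C++ the tree’s shape lives in the types themselves, so the compiler can collapse the whole expression into one fused loop; that’s where the operator-fusion comparison comes from.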
Then you get scolded by the CI guys because your template nightmare makes the build times 3x slower, so the project then also becomes an occasion to try out a bunch of tricks to speed up compilation.
Since it’s running mypyc (https://mypyc.readthedocs.io/en/latest/index.html) on the imports under the hood, I’m struggling to see why you’d want to limit yourself to the standard library instead of just compiling your whole program as a module with mypyc directly.
At the same time, if the gains are consistent, I also feel that CPython should just compile an entire standard library on installation.
EDIT: benchmarks for mypyc look impressive; the only thing that gets slower is exceptions:
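For what it’s worth, mypyc just wants type-annotated code: a hypothetical module like the one below runs unchanged under CPython and can be compiled with `mypyc fib.py` (the file name is illustrative):

```python
# fib.py -- a fully type-annotated module. It runs as-is under CPython,
# and `mypyc fib.py` compiles it into a C extension for a speedup.

def fib(n: int) -> int:
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

if __name__ == "__main__":
    print(fib(10))  # 55
```

The annotations aren’t optional decoration here: mypyc uses them to emit unboxed, C-level operations where it can.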
It’s because of marketing & blogging “gurus” giving advice like this: https://copyblogger.com/10-sure-fire-headline-formulas-that-work/
The goal is to get you to click the headline when you see it in Google search results or elsewhere. They can then convert your visit into ad revenue when you see or click their ads, or earn a commission when you buy something through their affiliate links (mostly the case when they link you to a product on Amazon).
Have developers be more mindful of the e-waste they’re contributing to by indirectly deprecating CPUs whenever they skip over portions of their code and say “nah, it isn’t worth optimizing that, and everyone today should have an X-core CPU / Y GB of RAM anyway”. Complacency like that is what makes software that is as simple in functionality as its equivalent from one or two decades ago 10 times more demanding today.
I don’t see the point of the video. The whole thing is just him looking at a screen.
Linear and logistic regression are much easier (and less error prone) to implement from scratch than neural network training with backpropagation.
That way you can still follow the progression I suggested: implement those regressions by hand using numpy -> compare against (and appreciate) sklearn -> implement SVMs by hand using cvxpy -> appreciate sklearn again.
If you get the hang of “classical” ML, then deep learning becomes easy as it’s still machine learning, just with more complicated models and no closed-form solutions.
I’d say since you’re a beginner, it’s much better to implement your regression functions and any necessary helper functions (train/test split, etc.) yourself in the beginning. Learn the necessary linear algebra and quadratic programming, and try to implement linear regression, logistic regression and SVMs using only numpy and cvxpy.
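If it helps, a from-scratch linear regression via the normal equations is just a few lines of numpy (synthetic data, purely illustrative; sklearn’s LinearRegression should recover the same coefficients):

```python
import numpy as np

# Linear regression from scratch via the normal equations:
# beta = (X^T X)^{-1} X^T y, solved with a linear solver instead of
# an explicit inverse. Data is synthetic and noise-free for clarity.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_beta = np.array([3.0, -1.0])
y = X @ true_beta + 0.5                      # 0.5 plays the intercept

Xb = np.hstack([np.ones((len(X), 1)), X])    # column of ones for the intercept
beta = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

print(beta)  # ~[0.5, 3.0, -1.0]: intercept, then the two slopes
```

Once you add noise and more features you’ll quickly run into conditioning issues, which is exactly the kind of thing that makes you appreciate what sklearn handles for you.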
Once you get the hang of it, you can jump straight into sklearn and be confident that you roughly understand what those “black boxes” really do, which will also help you a lot with troubleshooting.
For neural networks and deep learning, pytorch is establishing itself as the industry standard right now. Look up “adjoint automatic differentiation” (“backpropagation” doesn’t do it justice, since pytorch actually implements a very general dynamic AAD) and you’ll understand the “magic” behind the gradients pytorch gives you. Karpathy’s YouTube tutorials are really good as an intro to AAD/autodiff in the context of deep learning.
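To sketch what reverse-mode AAD actually does, here’s a bare-bones toy in the spirit of Karpathy’s micrograd (real pytorch autograd is far more general; every name here is illustrative):

```python
# Bare-bones reverse-mode autodiff: the mechanism behind loss.backward().
class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._grad_fn = None  # distributes this node's grad to its parents

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def grad_fn():
            self.grad += out.grad
            other.grad += out.grad
        out._grad_fn = grad_fn
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def grad_fn():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._grad_fn = grad_fn
        return out

    def backward(self):
        # Topologically sort the graph, then push gradients backwards.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            if v._grad_fn:
                v._grad_fn()

x, y = Value(2.0), Value(3.0)
z = x * y + x          # z = x*y + x, so dz/dx = y + 1 and dz/dy = x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

One forward pass builds the graph, one backward pass yields all partial derivatives at once; that single-sweep property is why AAD is the industry workhorse for gradients.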
Reimplementing stuff from scratch, overengineering and, if you’re coding in a compiled language, knowing a bit of assembly to be able to make better performance-related decisions.
EDIT to clarify the overengineering part: the idea is obviously not to do it at work, where you have to meet deadlines and spend time on what’s most important; the idea is to do it when working on your own personal projects. Try to make every component of your project as “perfect” as possible, as if you were trying to rebuild a Mercedes-AMG car. That’s how you’ll learn a bunch of new tricks.
Lemmy is probably a good live example of how sometimes going for a “faster language” like Rust isn’t going to magically make a bad SQL database design better or slow queries faster: https://github.com/LemmyNet/lemmy/issues/2877
So yeah, SQL optimization stories are definitely welcome too!
Congrats on still rocking a laptop that’s 10 years old!
Since you already know Java, you could jump straight to C++ with Bjarne Stroustrup’s book “Programming: Principles and Practice Using C++”: https://www.stroustrup.com/programming.html
You can then move to more modern C++ with his other book “A Tour of C++”: https://www.stroustrup.com/tour3.html
And then if you’re curious to know how software design is done in modern C++, even if you already know classical design patterns from your Java experience, you should get Klaus Iglberger’s book: https://www.oreilly.com/library/view/c-software-design/9781098113155/
In parallel also watch the “Back to Basics” video series by CppCon (see their YouTube channel: https://www.youtube.com/@CppCon , just type “back to basics” in that channel’s search bar).
Learning proper C++ should give you a much better understanding of the hardware while the syntax still remains elegant, and you get to add a new skill that’s in very high demand.