Yup, this. If you would like more help, we need the code, or at least a minimal reproducible example.
I might be open to the idea, but it would need to be a trustworthy company that doesn’t cancel products left and right. An IDE would be too annoying to switch away from constantly to take that risk.
(What do (you) mean? (Lisp certainly has its (downsides) and (upsides)))
This seems more focused on commercial license holders. There, paying for your IDE is not that uncommon, and the amount of revenue to be gained is a lot higher. That being said, I always found it a bit weird that JetBrains didn’t make CLion free for non-commercial use as they did with PyCharm/IntelliJ.
Started only 2 years ago, so I have some ways to go.
I’ll be starting with the Andrej Karpathy neural network series. It might not be reading per se, but I find it high time I actually go through and fully learn how each part of a neural net works together, instead of focusing only on small parts.
I have found it nice to use for large types (nested containers, lambdas) which are only used once and for which I would not necessarily want a typedef. However, I also don’t like using it too much: it’s basically trading coding speed for reading speed, and time and time again it has been found that the latter is done a lot more.
Those certainly also look nice; I had not noticed those.
I have been out of the ML world for a bit (like 6 months lol …) and I already feel way out of date. It seems like I should pick up the Vicuna LLM; I didn’t want to touch LLaMA initially due to the legal problems with it. I thought that would be a problem for a while, and then they went and solved it. Somehow I even missed the news of it, most likely due to the enormous amount of news coming from the ML world (I might need a model to summarize it). Anyways, thanks for the article, I know what to do this weekend.
The second part has some of this, but not as in depth as I’d like.
Back in the day, before university (around 6 years ago), I got recommended a MOOC (massive open online course) by the University of Helsinki. I used this course to get started with learning to program, and to find out whether it was something for me. It has been some time, and it seems they update the course, but I hope it can help you too. Here is the link: https://java-programming.mooc.fi/. It really starts from zero, with setting up the environment, which is nice. It is in Java using the NetBeans IDE, which some would call antique, but in my opinion that does not really matter when starting to learn.
The most interesting part here, I find, is the cost analysis. I was quite surprised to see that training it on current hardware would have cost a third of what it did back when they were training it. That is roughly a 3x improvement in a year to a year and a half. I wonder whether this trend will continue.
I wonder whether there even is a way to deal with this collapse. It just seems to me that data generated by the model itself will never contain new cases to learn from; I think it will just keep reinforcing the knowledge already in the model. This is not always a bad thing: we have been doing that (the reinforcing) for a long time now under the name “epochs”. But even with epochs you end up overfitting after a while.
Ended up using these quite a bit during my thesis; they are great for certain types of data. I read a lot of papers claiming these could be a solution to the causality problem, hence why you often see papers using GNNs named something-something-causal-something. The technical implementation, however, always felt a bit lackluster. The message-passing algorithms are at best one extra pass, but were often implemented as an extra function per link, scaling them quadratically per node. This was always a bit sad, as it prevented GNNs with large numbers of nodes. I haven’t looked at them in the last few years; I might need to do a new pass myself to see if they have evolved a bit. The article is quite good at introducing them, but does indeed miss the performance problem here a bit :)
On the other hand, starting to learn LaTeX through some smaller paper might prepare you for when you actually need the features it brings, during a thesis or something similar. Also, nowadays with Overleaf I find that it is not all too difficult to get started with LaTeX. Still, it remains a bit of a complex beast to master.
I’d say take the latest stable one, which at the moment is 4.0.3. They released their major rewrite (version 4) a few months ago, but for now they still support version 3. Considering you are starting from scratch, I’d say just go for 4. I have never used their tutorials myself (I got by with only the public docs and looking at other projects), but they have an entire page dedicated to them: https://docs.godotengine.org/en/stable/community/tutorials.html. Feel free to pick any one there.
Considering you have a low-end PC, I’d recommend trying Godot. As someone who has been in the game jam scene for a few years now, I have seen it used more and more. It is not the most powerful engine, especially compared to Unity and Unreal, but it is by far the easiest, both in user experience and in computer resources. As a bonus it is fully free and open source, which is always nice. For the learning part I’d recommend just starting: being bad at something is the first step to being kinda good at something (this is a quote from somewhere, and I don’t remember where). Good luck!
It seems like the servers are having some trouble [https://www.redditstatus.com/]. Couldn’t have chosen a better time.
This tool doesn’t, but the one linked in the second edit does.
I still am not sure what to think of this entire thing. It feels like at a certain point someone started playing circus music and forgot to turn it off.