I am mostly complaining about his writing style. Obviously the subject itself is interesting (to some people)
I wouldn’t bet an eye on it, but who knows! Maybe he was a better teacher before!
Again, Knuth himself said in a preface that Volumes 2 through 5 are independent.
That sounds interesting, I will take a look. I am not against theoretical computer science, I just think Knuth doesn’t read like a good teacher…
Because Volume 1 is not available in the library
Edit: but also, the volumes aren’t dependent on each other. They treat very different topics; I doubt reading Volume 1 will help with Volume 4.
I feel offended by you somehow equating Perl and Lisp
This is a much better done meme
The other one from before makes zero sense
2100 parameters is a documented ODBC limitation (which applies to all statements in a batch).
This means that an
“insert into (c1, c2) values (?,?), (?,?)…” statement can only have 2100 bound parameters. It has nothing to do with the code, and even less with the surrounding code being “spaghetti”.
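For context, the workaround on that path is just batching so that rows × columns stays under the limit. A rough sketch of the idea, assuming pyodbc and placeholder table/column names (not the actual code, which I can’t share):

```python
import pyodbc

MAX_PARAMS = 2100  # documented ODBC / SQL Server limit on bound parameters per statement batch

def batched_insert(conn, table, columns, rows):
    """Multi-row parameterized INSERTs, chunked so each statement stays under the limit."""
    rows_per_batch = max(1, (MAX_PARAMS - 1) // len(columns))
    row_placeholder = "(" + ", ".join("?" * len(columns)) + ")"
    cur = conn.cursor()
    for start in range(0, len(rows), rows_per_batch):
        chunk = rows[start:start + rows_per_batch]
        sql = (
            f"INSERT INTO {table} ({', '.join(columns)}) VALUES "
            + ", ".join([row_placeholder] * len(chunk))
        )
        # flatten the chunk into one parameter list matching the placeholders
        cur.execute(sql, [value for row in chunk for value in row])
    conn.commit()

# usage (hypothetical DSN and schema):
# conn = pyodbc.connect("DSN=marketdata")
# batched_insert(conn, "calibration_results", ["vol", "fwd_duration"], rows)
```

With 50 columns that caps you at roughly 41 rows per statement, which is exactly why this path is slow and why bulk copy exists.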
The tables ARE normalised; the fact that there are 50 columns is because the underlying market-data calibration functions expect dozens of parameters and return dozens of results, such as volatility, implied durations, forward duration and more.
The amount of immaturity, inexperience, and ignorance coming from 2 people here is astounding
Blocked
You should take a break from trolling
I timed the transaction and the opening of the connection; it takes maybe 100 milliseconds, which absolutely doesn’t explain the abysmal performance.
The transaction is needed because two tables are touched; I don’t want to deal with partially inserted data.
Cannot share the code, but it’s Python calling .NET through “clr”, and using SqlBulkCopy.
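Can’t paste the real thing, but the shape is roughly this: pythonnet (“clr”) driving System.Data.SqlClient’s SqlBulkCopy, with both tables written under one SqlTransaction because of the partial-insert point above. All table and column names here are made up:

```python
import clr
clr.AddReference("System.Data")  # .NET Framework assembly that contains SqlClient

from System import String, Double
from System.Data import DataTable
from System.Data.SqlClient import SqlBulkCopy, SqlBulkCopyOptions, SqlConnection


def to_datatable(columns, rows):
    """Build a DataTable from (name, .NET type) pairs and an iterable of tuples."""
    dt = DataTable()
    for name, net_type in columns:
        dt.Columns.Add(name, net_type)
    for row in rows:
        dt.Rows.Add(*row)
    return dt


def bulk_insert_both(conn_str, inputs_dt, results_dt):
    """Bulk-copy two DataTables inside one transaction: both land, or neither does."""
    conn = SqlConnection(conn_str)
    conn.Open()
    tx = conn.BeginTransaction()
    try:
        for table, dt in (("calibration_inputs", inputs_dt),
                          ("calibration_results", results_dt)):
            bulk = SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tx)
            bulk.DestinationTableName = table
            bulk.WriteToServer(dt)
        tx.Commit()
    except Exception:
        tx.Rollback()
        raise
    finally:
        conn.Close()


# usage (hypothetical schema and connection string):
# dt = to_datatable([("instrument", String), ("vol", Double)], [("EUR5Y", 0.21)])
# bulk_insert_both("Server=...;Database=...;Trusted_Connection=True;", dt, dt)
```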
What do you suggest, if I shouldn’t be using that? It’s either a prepared query with thousands of parameters, or a plain text string with the parameters inside (which admittedly I didn’t try, might be faster lol).
I will try bcp. Somehow, I was convinced I had to have access to the machine running the SQL Server to use it, but from the docs I see I can specify a remote host… Will report back! EDIT: I can’t install bcp because it is only distributed with SQL Server itself, and I cannot install it on my corporate laptop.
Please enlighten us? You barely know anything about the system or its usage, and you have deduced NoSQL is better? Lol
I am using SqlBulkCopy; given how bad MS is with naming things, that might as well be doing row inserts instead of bulk inserts.
Oh buddy, enjoy your life & don’t touch Microsoft even with a 10-meter stick
So they posted that screenshot before even trying to run it on some throwaway file to see if it works… Internet points are surely a drug
React + Python + Postgres/SQLite
We have a solution
Your last question is equivalent to: why are there so many math theories? Can’t we just reuse the old ones?
New languages appear as a natural product of research in type theory, for example.
Even something as ubiquitous as JSON is not handled the same way in different databases; the same goes for dates and UUIDs. I am not even mentioning migration scripts. As soon as you start writing raw SQL, I am pretty sure you will hit a compatibility issue.
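Concretely, even the “same” JSON lookup already has to be written per engine (illustrative queries only, against a hypothetical orders table with a JSON payload column):

```python
# PostgreSQL: JSON operators on a json/jsonb column
POSTGRES_QUERY = """
    SELECT payload ->> 'status'
    FROM   orders
    WHERE  payload ->> 'status' = 'open'
"""

# SQLite: JSON1 functions instead of operators
SQLITE_QUERY = """
    SELECT json_extract(payload, '$.status')
    FROM   orders
    WHERE  json_extract(payload, '$.status') = 'open'
"""
```

A query builder or ORM is what papers over exactly this kind of difference.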
I was specifically talking about Python; I can’t argue with Golang. OK, you have a valid point on performance, gotta keep an eye on that. However, I am satisfied with it for our CRUD API.
You’re not stupid; Python’s packaging & versioning is a PITA. As long as you write it for yourself, you’re good. As soon as you want to share it, you have a problem.