Okay, that definitely explains it
Banned for being linked to the Russian state, or for being Russian? Lol, those are very, very different things
I am mostly complaining about his writing style. Obviously the subject itself is interesting (to some people)
I wouldn't bet on it, but who knows! Maybe he was a better teacher before!
Again, Knuth himself said in a preface that Volumes 2 through 5 are independent.
That sounds interesting, I will take a look. I am not against theoretical computer science, I just think Knuth doesn't read like a good teacher…
Because Volume 1 is not available in the library
Edit: but also, the volumes aren't dependent on each other. They treat very different topics; I doubt reading Volume 1 will help with Volume 4.
I don't understand why this is called a "subset" when it clearly contains new syntax
A subset would be understood by older compilers; this is a superset
Absolutely not a replacement for VBA. Not even close. As usual, Microsoft hypes something everyone wants, and then ships something nobody asked for
I feel offended by you somehow equating Perl and Lisp
This is a much better-done meme
The other one from before makes zero sense
2100 parameters is a documented ODBC limitation (which applies to all statements in a batch)
This means that an "insert into (c1, c2) values (?,?), (?,?)…" statement can only have 2100 bound parameters; it has nothing to do with the code, and even less with the surrounding code being "spaghetti"
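For illustration, a rough batching sketch that keeps every statement under that limit (pyodbc and the names my_table, c1, c2 are assumptions for the example, not the real code):

```python
import pyodbc  # assumed driver; any DB-API connection with "?" placeholders works the same way

PARAM_LIMIT = 2100  # documented per-request parameter cap
ROW_LIMIT = 1000    # SQL Server also caps a single VALUES constructor at 1000 rows

def insert_in_batches(conn, rows, params_per_row=2):
    # Size each batch so rows_per_batch * params_per_row never reaches the cap.
    rows_per_batch = min(ROW_LIMIT, PARAM_LIMIT // params_per_row)
    one_row = "(" + ", ".join(["?"] * params_per_row) + ")"
    cursor = conn.cursor()
    for start in range(0, len(rows), rows_per_batch):
        batch = rows[start:start + rows_per_batch]
        placeholders = ", ".join([one_row] * len(batch))
        sql = f"INSERT INTO my_table (c1, c2) VALUES {placeholders}"
        flat_params = [value for row in batch for value in row]
        cursor.execute(sql, flat_params)
    conn.commit()

conn = pyodbc.connect("DSN=mydsn")  # placeholder connection string
insert_in_batches(conn, [(i, i * 0.5) for i in range(5000)])
```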
The tables ARE normalised; there are 50 columns because the underlying market-data calibration functions expect dozens of parameters and return dozens of other results, such as volatility, implied durations, forward durations and more
The amount of immaturity, inexperience, and ignorance coming from 2 people here is astounding
Blocked
You should take a break from trolling
I timed the transaction and the opening of the connection; it takes maybe 100 milliseconds, which absolutely doesn't explain the abysmal performance
The transaction is needed because 2 tables are touched; I don't want to deal with partially inserted data
Cannot share the code, but it's Python calling .NET through "clr", and using SqlBulkCopy
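Roughly along these lines, as a sketch only (the connection string, table name, and columns are invented here; the real schema has far more columns):

```python
import clr
clr.AddReference("System.Data")  # .NET Framework assembly; hosting details vary by runtime

from System import Type
from System.Data import DataTable
from System.Data.SqlClient import SqlConnection, SqlBulkCopy, SqlBulkCopyOptions

def bulk_load(rows, conn_str):
    # Build an in-memory DataTable matching the destination table's schema.
    table = DataTable()
    table.Columns.Add("instrument_id", Type.GetType("System.Int32"))
    table.Columns.Add("volatility", Type.GetType("System.Double"))
    for instrument_id, volatility in rows:
        row = table.NewRow()
        row["instrument_id"] = instrument_id
        row["volatility"] = volatility
        table.Rows.Add(row)

    conn = SqlConnection(conn_str)
    conn.Open()
    tx = conn.BeginTransaction()  # a second SqlBulkCopy into the other table can share this transaction
    try:
        bulk = SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tx)
        bulk.DestinationTableName = "dbo.calibration_results"
        bulk.BatchSize = 5000          # rows sent per round trip to the server
        bulk.WriteToServer(table)      # streams rows through the bulk-load path
        tx.Commit()
    except Exception:
        tx.Rollback()
        raise
    finally:
        conn.Close()
```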
What do you suggest, if I shouldn't be using that? It's either a prepared query with thousands of parameters, or a plain-text string with the parameters inlined (which, admittedly, I didn't try; it might be faster lol)
Will try bcp & report back. EDIT: I can't install bcp because it is only distributed with SQL Server itself, and I cannot install it on my corporate laptop.
I will try bcp. Somehow, I was convinced I had to have access to the machine running the SQL Server to use it, but from the docs I see I can specify a remote host… Will report back! EDIT: I can't install bcp because it is only distributed with SQL Server itself, and I cannot install it on my corporate laptop.
Please enlighten us? You barely know anything about the system or its usage, and you have deduced NoSQL is better? Lol
I am using SqlBulkCopy; given how bad MS is with naming things, that might as well be doing row-by-row inserts instead of bulk ones
Oh buddy, enjoy your life & don't touch Microsoft even with a 10-meter stick
Don't write "if" in your tests! It makes very, very little sense: how is it that you are testing your application and yet are unsure what the outcome of a call should be? Does it depend on the arguments? Then fix the arguments and expect one specific result. Does it depend on the environment? Fix/mock the environment.
No “ifs” in the tests!
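For illustration, a minimal pytest-style sketch; parse_price is a made-up stand-in for whatever is actually under test:

```python
import pytest

def parse_price(text):
    # Stand-in for the real code under test.
    return float(text.replace(",", ""))

# Avoid: branching inside the test body.
# def test_parse_price(value):
#     if "," in value:
#         assert parse_price(value) == 1000.0
#     else:
#         assert parse_price(value) == 10.0

# Prefer: one fixed input and one specific expected result per case.
@pytest.mark.parametrize(
    ("text", "expected"),
    [
        ("10.0", 10.0),
        ("1,000.0", 1000.0),
    ],
)
def test_parse_price(text, expected):
    assert parse_price(text) == expected
```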