Last Monday I had the pleasure of giving a Flash Talk at London Web Standards, a monthly meetup for discussing the state of the web and learning new technologies.
For this talk I brought one of the processes I cherish most – Role Switching, or how to walk a mile in the shoes of your fellow developer and understand their problems and difficulties, with the final goal of growing both as an engineer and as a human being.
Speaking from personal experience, I highly suggest you try this process at least once, ESPECIALLY if conflicts are common and tension is high between members of your team.
I hope you will enjoy this presentation as much as I enjoyed creating it (the video will be up soon).
As you can see here, the code is correct, but not very efficient.
To create the deck, we have to loop over all 13 values for every element in the SUITES array.
13 * 4 = 52 iterations. The complexity is clearly O(n²).
Since we are working with a small set of data this may be negligible, but still, is there a better way to write this code?
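For context, the first implementation appears earlier in the post and is not repeated here; a rough sketch of that nested-loop shape (the exact original code may differ, and the SUITES values shown are just an assumption) looks like this:

    class Deck
      SUITES = ["C", "D", "H", "S"]
      VALUES = ["2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A"]

      attr_accessor :cards

      # For every suite we walk the whole VALUES array: 4 * 13 iterations
      def initialize
        @cards = []
        SUITES.each do |suite|
          VALUES.each do |value|
            @cards << "#{value}#{suite}"
          end
        end
      end
    end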
Presenting the ImprovedDeck class!
Deck class, second implementation
    class ImprovedDeck
      VALUES = ["2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A"]

      attr_accessor :cards

      # YAY! Much better O(N) initialization :-D
      def initialize
        @cards = []
        VALUES.each do |value|
          @cards << ["#{value}C", "#{value}D", "#{value}H", "#{value}S"]
        end
        @cards.flatten!
      end
    end
In the code above, we go through each element of the VALUES array only once, ending up with the same deck as the first implementation. The tradeoff of this speedup is that we can no longer pass the SUITES into the class dynamically, but this is not a case where we need more than the regular set of suites.
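As a quick sanity check (not from the original post), the improved class still produces a full 52-card deck:

    deck = ImprovedDeck.new
    deck.cards.size   # => 52
    deck.cards.first  # => "2C"
    deck.cards.last   # => "AS"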
Complexity is now O(n), which is also confirmed by the following benchmarks:
Benchmarks
(on running the same method 1000 times)

                        user     system      total        real
    Deck            0.680000   0.020000   0.700000 (  0.703846)
    Improved Deck   0.050000   0.000000   0.050000 (  0.047859)
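The benchmark script itself is not in the post; numbers in this shape could be produced with Ruby's standard Benchmark module, roughly like this (assuming both classes are loaded):

    require 'benchmark'

    Benchmark.bm(15) do |bm|
      bm.report("Deck")          { 1000.times { Deck.new } }
      bm.report("Improved Deck") { 1000.times { ImprovedDeck.new } }
    end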
Of course, real-life cases are not so easy to solve, and sometimes O(n²) complexity is more than acceptable for a given problem, but I think it is good practice to keep your mind open to further optimizing your code.
PS: 1 bazillion thanks to Tim Ruffles for his help and constructive criticism. I wish all criticism were like yours!
Today one of my tasks was to convert a PDF to a PNG of the same quality, but sadly, all my attempts came out low resolution.
So after a little research in the documentation I found the density option, which defaults to 72. A PDF is around 300, so 200 is a good tradeoff (at least for now).
Also, I found it a little misleading that in some guides the density declaration comes after the write call, while it should be right after you read the file… anyway, here is a nice example! Enjoy!
Conversion from PDF to Image without quality loss.
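A minimal sketch of the idea, assuming RMagick (the file names and the 200 density are placeholder values, and the exact block syntax can vary between RMagick versions); the important part is that the density is set in the block passed when reading the PDF, not at write time:

    require 'rmagick'

    # The density must be declared when the PDF is read, not when the PNG is written.
    # 72 is the default; 200 keeps most of the quality of a ~300 DPI PDF.
    pages = Magick::ImageList.new("input.pdf") { self.density = "200" }

    # Write one PNG per page of the PDF.
    pages.each_with_index do |page, index|
      page.write("page_#{index}.png")
    end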