flat assembler
Message board for the users of flat assembler.

Index > Heap > Premature Optimization


What do you think?
I agree.  12%  [ 1 ]
I disagree.  37%  [ 3 ]
I don't care.  37%  [ 3 ]
I'm too drunk to decide/understand.  12%  [ 1 ]
Total Votes : 8

Tyler
Joined: 19 Nov 2009
Posts: 1216
Location: NC, USA
Donald Knuth wrote:
We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.
For anyone who doesn't know, Knuth wrote The Art of Computer Programming, the four-volume computer science bible.
Post 28 Mar 2015, 03:39
HaHaAnonymous
Joined: 02 Dec 2012
Posts: 1180
Location: Unknown
Interesting name for the book (though I am not good with arts). I didn't know about him either; I have never bought any book related to programming because I am very poor, my country is not so nice about importing goods (especially from the USA), and local programming books are scarce; when you do find them, they are not in English and they usually cover a specific programming language instead of programming itself. D:

As said above, you can consider me ignorant, and you may not want to pay much attention to what I'm going to say below. Continue at your own risk...



---
First, what do you consider small efficiencies? Second, what do you really mean by "premature optimization"?

Is "premature optimization" the optimization you apply when the code is not yet done and/or tested?
Is "premature optimization" the optimization you apply first to the most insignificant areas of the code instead of more important ones?

Or is this another kind of topic: "Code it first, optimize it later"? I guess I am bit lost. D:

I apologize for any inconvenience I may have caused.
Post 28 Mar 2015, 04:11
revolution
When all else fails, read the source
Joined: 24 Aug 2004
Posts: 17278
Location: In your JS exploiting you and your system
I don't think it is the "root of all evil", but it can certainly make a programmer less efficient at producing working code. And perhaps it can sometimes be the cause of abandoning a project.
Post 28 Mar 2015, 04:13
HaHaAnonymous
Joined: 02 Dec 2012
Posts: 1180
Location: Unknown
Quote:

And perhaps it can sometimes be the cause of abandoning a project.

This sounds interesting... not the abandonment of the project, but its cause.

I usually give up on a personal project when I suddenly lose interest in it or when I think it is not worth investing time in. I HAVE A PILE OF THEM (9 is what I could count right now)! D:

And there are the people who abandon a project because of a lack of users (see the proprietary software world: Skype, MSN, WMP, Internet Explorer, Winamp, TrueCrypt, all dead).

And there are the people who abandon a project because of a bug they could not solve in the last 48 hours. D:
Post 28 Mar 2015, 04:31
redsock
Joined: 09 Oct 2009
Posts: 357
Location: Australia
Pretty sure all of the top supercar manufacturers design every single component from the ground up with "optimization" (a.k.a. performance) in mind. I suppose for programming it largely depends on what you are building, but I can't categorically agree that you should ignore it altogether for 97% of the code.
Post 28 Mar 2015, 04:38
Tyler
Joined: 19 Nov 2009
Posts: 1216
Location: NC, USA
HaHaAnonymous wrote:
Interesting name for the book (though I am not good with arts). I didn't know about him either; I have never bought any book related to programming because I am very poor, my country is not so nice about importing goods (especially from the USA), and local programming books are scarce; when you do find them, they are not in English and they usually cover a specific programming language instead of programming itself. D:
Here are the first three volumes. They're 700+ pages each, so you might want to download them overnight if your internet is slow.

http://www.itpa.lt/~acus/Knygos/Donald.E.Knuth%20-%20The%20Art%20of%20Computer%20Programming%20I-III,%20Concrete%20Mathematics,%20The%20Tex%20Book/Addison.Wesley.Donald.E.Knuth.The.Art.of.Computer.Programming.Volume.1.pdf

http://www.itpa.lt/~acus/Knygos/Donald.E.Knuth%20-%20The%20Art%20of%20Computer%20Programming%20I-III,%20Concrete%20Mathematics,%20The%20Tex%20Book/Addison.Wesley.Donald.E.Knuth.The.Art.of.Computer.Programming.Volume.2.pdf

http://www.itpa.lt/~acus/Knygos/Donald.E.Knuth%20-%20The%20Art%20of%20Computer%20Programming%20I-III,%20Concrete%20Mathematics,%20The%20Tex%20Book/Addison.Wesley.Donald.E.Knuth.The.Art.of.Computer.Programming.Volume.3.pdf

HaHaAnonymous wrote:
First, what do you consider small efficiencies? Second, what do you really mean by "premature optimization"?

Is "premature optimization" the optimization you apply when the code is not yet done and/or tested?
Is "premature optimization" the optimization you apply first to the most insignificant areas of the code instead of the more important ones?

Or is this another kind of topic, "code it first, optimize it later"? I guess I am a bit lost. D:
Yeah, that's basically it. The idea is to do profile-guided optimization: write it, profile it, then optimize the parts that are actually taking the most time.
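A minimal sketch of the "profile it" step in assembly (not from the post; hot_routine is a hypothetical label, and RDTSC counts raw clock cycles, so treat the numbers as indicative only):
Code:
        rdtsc                       ; EDX:EAX = time-stamp counter before the call
        mov     esi, eax            ; keep the low dword of the start count
        call    hot_routine         ; code under measurement (assumed to preserve esi)
        rdtsc                       ; EDX:EAX = time-stamp counter after the call
        sub     eax, esi            ; elapsed cycles in eax (low dword; fine for short runs)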
Post 28 Mar 2015, 05:16
JohnFound
Joined: 16 Jun 2003
Posts: 3500
Location: Bulgaria
As a rule, heavily optimized HLL code becomes completely unreadable and very hard to maintain. That is why HLL programmers should optimize as little code as possible, probably only the inner loops in the critical places of the program.

But in assembly programming, optimizations, not always, but very often, improve the readability of the code and make it easier to maintain. Such early optimizations can be very helpful and should not be avoided.

Another optimization that must be made as early as possible is algorithm-level optimization. A properly chosen algorithm can improve performance much more than instruction-level optimizations, but changing key algorithms in the later stages of development can be very hard or even impossible.
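As an illustration of the algorithm-level point (a minimal sketch, not from the post; labels and register conventions are chosen arbitrarily): replacing a linear scan with a binary search over a sorted dword array cuts the comparisons from O(n) to O(log n), a gap no instruction-level tuning of the scan can close.
Code:
; hypothetical routine: binary search over a sorted dword array (unsigned keys)
; in:  esi = array base, ecx = element count, eax = key
; out: CF clear and edx = index if found; CF set if not found
bsearch_dword:
        xor     edi, edi            ; lo = 0; ecx serves as hi (exclusive)
.next:
        cmp     edi, ecx
        jae     .missing            ; lo >= hi: key is not present
        lea     edx, [edi+ecx]
        shr     edx, 1              ; mid = (lo + hi) / 2
        cmp     eax, [esi+edx*4]
        je      .found
        ja      .right
        mov     ecx, edx            ; key < element: hi = mid
        jmp     .next
.right:
        lea     edi, [edx+1]        ; key > element: lo = mid + 1
        jmp     .next
.found:
        clc                         ; edx holds the index
        ret
.missing:
        stc
        ret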
Post 28 Mar 2015, 06:31
AsmGuru62
Joined: 28 Jan 2004
Posts: 1409
Location: Toronto, Canada
I think some optimization must be done while writing the code, to avoid blunders like:
Code:
mov eax, 0    ; better: xor eax, eax (shorter encoding, same result)
cmp eax, 0    ; better: test eax, eax (shorter, equivalent flags for a zero test)
etc.

Then, of course, after the program is working, some review is needed in the places with heavily loaded loops, such as label alignment in loops, etc.
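Regarding label alignment, an illustrative sketch (not from the post; the loop body is hypothetical): padding the loop header to a 16-byte boundary lets the front end fetch the hot loop in fewer blocks on many x86 cores.
Code:
        align   16                  ; pad with NOPs so the loop header starts on a 16-byte boundary
copy_loop:
        lodsb                       ; hypothetical byte-copy loop body
        stosb
        dec     ecx
        jnz     copy_loop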
Post 28 Mar 2015, 12:50
Tyler
Joined: 19 Nov 2009
Posts: 1216
Location: NC, USA
JohnFound, I think you're probably right. I never thought of it like that (except for the algorithm part).
Post 28 Mar 2015, 15:12
l_inc
Joined: 23 Oct 2009
Posts: 881
Tyler, JohnFound

JohnFound is most certainly wrong on both points: on the advantages of optimizing as little as possible in HLL, and on the improved readability of optimized assembly. Aggressively optimized assembly sucks much more than aggressively optimized HLL with respect to readability.

HaHaAnonymous posed a justified question regarding the definition of "premature optimization". And to define it (in a way that best emphasizes the disadvantages) premature optimization is an optimization that will affect your design decisions, which are either not certain or not even made yet. E.g., you pack bits into a structure, so that it fits your cache line and then you find out that your best algorithm needs to sequentially access a single field in every structure of an array of these. Or you optimize accesses to your hash table by meticulously seeking hashing and probing algorithms that best fit your data, and then you find out that you need frequent range searches over that data and hence a tree would be a much better container for it.
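As an aside, a minimal sketch of the first trade-off (not from the post; the field layout is hypothetical): once 'count' is packed into bits 4..15 of a dword record, every sequential pass over the array pays a shift and a mask per element that an unpacked layout would avoid.
Code:
; packed layout: 'count' occupies bits 4..15 of a dword record
        mov     eax, [esi]          ; load the whole packed record
        shr     eax, 4
        and     eax, 0FFFh          ; extract 'count': a shift and a mask on every access
; unpacked layout: 'count' is a plain word at offset 2
        movzx   eax, word [esi+2]   ; one load, no shifting or masking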

_________________
Faith is a superposition of knowledge and fallacy
Post 29 Mar 2015, 00:20
JohnFound
Joined: 16 Jun 2003
Posts: 3500
Location: Bulgaria
Quote:
Aggressively optimized assembly sucks much more than aggressively optimized HLL with respect to readability.

Maybe from an HLL programmer's point of view, or as an exception. I wrote "not always, but very often" because exceptional cases are always possible.

About your definition of premature optimization: of course, optimizing code that is not even designed properly is wrong, but I am not sure "premature optimization" means exactly that. At least you can always define "premature optimization" in a way that makes it wrong. But this is a circular definition. Also see Сепулька. Wink
Post 29 Mar 2015, 05:36
l_inc
Joined: 23 Oct 2009
Posts: 881
JohnFound
Quote:
I wrote "not always, but very often" because exceptional cases are always possible.

I could write the same thing but replace "very often" with "very rarely", because of some exceptional cases... you know. Eventually we'd both be imagining contradictory exceptional cases, thinking they were the rule.

A quite common situation is when you need to inline functions. Say you have a short custom string-comparison function in your source that is called from three different locations in your assembly program. To make it work faster, you decide to inline the function into all these locations:
1) there is no longer a logically separate piece of the program;
2) you remove local variables and arguments so that there are no more memory accesses, and all the processed data effectively becomes unnamed;
3) to reduce the number of read-after-write hazards, you reassign the data to different registers;
4) additionally, you reorder and spread the code of the inlined function across the code of the caller to add more instruction-level parallelism;
5) you apply context-dependent tricks, such as putting -1 into a register with an sbb reg, reg located in a jc branch (sketched below);
6) all of the above is done three times, each time differently.
Epilogue) Oh, now you realize that the comparison function had better work differently (so much for the premature optimization).
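For anyone unfamiliar with the trick in point 5, a minimal sketch (registers chosen arbitrarily): a jc branch is only taken when CF = 1, and conditional jumps do not alter flags, so inside that branch sbb reg, reg yields reg - reg - CF = -1 without a mov.
Code:
        cmp     eax, ebx
        jc      below               ; taken when eax < ebx (unsigned); flags survive the jump
        xor     ecx, ecx            ; fall-through path: CF = 0, so ecx = 0
        jmp     done
below:
        sbb     ecx, ecx            ; CF = 1 here, so ecx = ecx - ecx - 1 = -1
done: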

Now compare that to the same aggressive optimization in C: you add the forceinline attribute to the function prototype. Far less readable, right? Wink

In fact, if you stick to a generally followed intra-program calling convention in your assembly, then you are not writing aggressively optimized assembly.

Quote:
At least you can always define "premature optimization" in a way that makes it wrong. But this is a circular definition.

It's not. Firstly, because the term "wrong" does not appear in the definition chain of "premature optimization" but is rather an informal assessment of something appearing in that chain. Secondly, because the definition of "wrong" would not rely on the term "premature optimization". So it would not be an equivalence (which is essential for a definition), but an implication.

But nitpicking aside, many (lower-level) design decisions evolve while implementing a program, which means you should not start optimizing before you have a more or less complete implementation that gives you a good overall impression of all your local design decisions. This, however, should not be confused with applying obviously stupid and inappropriately slow methods or ignoring general performance-related guidelines (which is probably what you call "heavy optimization"). And when you really have that working implementation, then it's time to start optimizing at the locations that induce the largest performance penalties.

Quote:
Also see Сепулька.

Bad example, because the definition circle is broken twice: there is no reliance on a term, merely a reference, "see ...".

_________________
Faith is a superposition of knowledge and fallacy
Post 29 Mar 2015, 16:59
Sean4CC
Joined: 15 Apr 2015
Posts: 14
That is a sound bite from someone who only wrote short algorithms. In a large system you will permanently bake in inefficiencies.
Post 17 Apr 2015, 17:45
nyrtzi
Joined: 08 Jul 2006
Posts: 192
Location: Off the scale in the third direction
revolution wrote:
I don't think it is the "root of all evil", but it can certainly make a programmer less efficient at producing working code. And perhaps it can sometimes be the cause of abandoning a project.


Yep, it's usually more important to release something as fast as possible which works, perhaps not perfectly, but well enough to keep people happy enough to want to invest in and contribute to the project so it can continue. The other option is to try to build something perfect and release it when it's ready, which can either be a day that never comes, or one that comes too late for the software to be relevant anymore.

When thinking of building something, one really should first write down a specification that takes a stand on what is good enough, not just feature-wise but also in terms of performance; if performance doesn't have to be high, then one can take shortcuts to get things done faster, as long as the end result conforms to the requirements.

Then again, it might be a good idea to architect the code so that it is easier to modify, and thus to optimize later, for example by making it modular enough to allow replacing components with better-optimized ones.
Post 26 Apr 2015, 07:42
AsmGuru62
Joined: 28 Jan 2004
Posts: 1409
Location: Toronto, Canada
Very good analysis, nyrtzi!
I support a product at work.
Written in 1997-2000, it is still profitable; clients like it, and that is what matters.
We spent close to 1.5 years between the last releases, and that did not sit well with the clients.
We tried to make it better with modules, optimizations, etc.
We learned the lesson: all of that is good only if the clients are happy.
But if they're not, release what you have right now!
Smile
Post 26 Apr 2015, 15:58