flat assembler
Message board for the users of flat assembler.
flat assembler > Programming Language Design > why learn code that doesn't run?

rugxulo



Joined: 09 Aug 2005
Posts: 2124
Location: Usono (aka, USA)

why learn code that doesn't run?

The relevant quote was from the Windows subforum and thread titled Learning Assembly:


C0deHer3tic wrote:

Why would I want to learn programming for code that doesn't even run in my environment? If there is a good reason, I am open to hear it.



I am not well-trained in Comp. Sci., so I hope someone here can explain better than I can. Corrections welcome. (BTW, this is a technical discussion, not a personal attack. Do I have to disclaim that? I am not mocking the OP, merely saying that apparently theory is [more?] important.)

Roughly speaking, AFAICT, the algorithm should come first, and be thought out before any half-hearted implementation. This (usually) means pseudo-code or close to it. Some languages (esp. in the ALGOL line) are more obvious than others for things like this. (See Wirth's _Programming in Oberon_ and _Algorithms and Data Structures (Oberon)_ too, both of which are updated versions of older texts.)

So it's not so much knowing what each discrete assembly instruction does, how it affects the flags, what functions are called, or what the calling conventions are. It's more about "why" than "how".

I don't have perfect references to quote, so I can only indirectly find some weak ones.

https://en.wikiquote.org/wiki/Edsger_W._Dijkstra


Quote:

The precious gift that this Turing Award acknowledges is Dijkstra's style: his approach to programming as a high, intellectual challenge; his eloquent insistence and practical demonstration that programs should be composed correct, not just debugged into correctness; and his illuminating perception of problems at the foundations of program design.
...
The difference between a computer programmer and a computer scientist is a job-title thing. Edsger Dijkstra wants proudly to be called a "computer programmer," although he hasn't touched a computer now for some years.
...
I learned later that one of the advantages of designing without pencil and paper is that you are almost forced to avoid all avoidable complexities.



And yet I still say it's much harder to explain low-level assembly without a debugger. The debugger is your friend. You can't live without it. Then again, it is a crutch.


Quote:

If you want more effective programmers, you will discover that they should not waste their time debugging, they should not introduce the bugs to start with.



You can easily program in (some) HLLs without debuggers. (Perhaps not C.) But assembly, and other languages with weak (or no) typing, won't help you as much. In weakly-typed languages, it's easy to write "correct" code that compiles fine but doesn't at all do what you intended. (Well, this is a good reason not to do everything from scratch: use well-tested libs, with printf, so that you don't have to debug your own.)
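As a tiny hedged C illustration of that point (is_zero_buggy and is_zero_fixed are made-up names, not from any code in this thread): the assignment below was meant to be a comparison, and the compiler accepts it silently unless warnings are enabled.

```c
/* Compiles cleanly, yet does the wrong thing: */
int is_zero_buggy(int x) {
    if (x = 0)          /* BUG: assigns 0 to x, so the test is always false */
        return 1;
    return 0;
}

int is_zero_fixed(int x) {
    if (x == 0)         /* intended comparison */
        return 1;
    return 0;
}
```

The buggy version returns 0 even for a zero argument, which no type checker will catch for you.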

Hey, everything has bugs, but the point is you should learn to know exactly what you're doing and why. Maybe that's why learning assembly can be beneficial. See Knuth's answer to Why have a machine language? (regarding his theoretical 64-bit assembly cpu, MMIX, used in his books).

Other references? Apparently Windows NT was intentionally portable from the start (with heavy emphasis from its lead architect, Dave Cutler, who previously worked on VMS and later did a lot to help the transition to 64-bit). Even if some of those arches died, I think the lesson is that it still helped the codebase.

Apparently another language for the PDP-11 was LIL, sitting somewhere between assembly and C. Yet it was later considered a failure because the languages on either side of it were more effective targets. (That might be partially true, but I still somewhat disagree with that assessment.)

And, of course, the Ten Commandments of C basically tell you that portability, error-checking, and defensive programming are better than sloppy, "ad hoc" hacks thrown together overnight.

Does this mean I hate assembly or that it's useless? Obviously not. But not everything has to be written in assembly (or C++17 or ...). The cheap adage is "use the right tool for the job". I guess I'm cynical and don't want to overemphasize x86, even if it's easily available everywhere (for now, despite ARM also being everywhere). Heck, FASMARM and FASMG pretty much embrace this multi-arch world. (But yes, I love "portable" cpu emulators!)

I'm pretty sure that real Comp. Sci. teaches you to learn general computing principles and not get caught up in fad of the week. Don't stick too tightly to any tech, it won't last. But good algorithms (and data structures?) are timeless (or at least less short-lived).

SUMMARY: Just to reiterate, learning or using assembly is fine. But just knowing about MOV and CMP/JZ and CALL/RET isn't much use by itself. HLLs are far from perfect but do indeed insulate you from all the irrelevant details. You're free to learn later, if desired, but usually it doesn't come to that.

P.S. Here's one (untested) link to some course lecture notes in Comp. Sci., see if it strikes any chords: Computer Architecture and Organization (and here).
Post 03 Apr 2017, 06:38
rugxulo



Joined: 09 Aug 2005
Posts: 2124
Location: Usono (aka, USA)

Re: why learn code that doesn't run?


rugxulo wrote:
Maybe that's why learning assembly can be beneficial. See Knuth's answer to Why have a machine language? (regarding his theoretical 64-bit assembly cpu, MMIX, used in his books).



I haven't studied this. Just to explain, it seems he designed a "simpler" (RISC-y) cpu for students to learn from. And, while there is no actual physical hardware, there are working simulators (debuggers?) for it and even a GCC port.

Of course, it's debatable whether you should learn "yet another" cpu (especially virtual) instead of just jumping in to x86. There are arguments that learning Esperanto then makes it easier to learn further languages. But you could also argue that learning E-o (however simplified) is time wasted that would be better spent on a more practical target language (like ... I don't know, German or Mandarin or whatever).

This is not an airtight discussion. I'm not exactly sure what I'm trying to express or understand. There is no perfect answer to solve all problems equally. I really don't want to discourage study of x86, but it's of limited use unless you already understand things at a higher level.

The simplest answer is that you should study your HLL compiler's (assembly) output, e.g. gcc -g -c -S -masm=intel (etc) and relevant library routine sources. For example, I have written a C version of printf("%ld") that is easier to understand than a full printf() implementation and even simpler than a raw assembly version. E.g. here, study this TinyStdio and its corresponding assembly output.
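To be clear, TinyStdio is its own code; but as a hedged sketch of the idea, a minimal "%ld-only" formatter might look something like this in C (fmt_long is a hypothetical name, not the routine linked above):

```c
#include <stddef.h>

/* Sketch of a minimal printf("%ld") core: format a signed long into out,
   return the length. Digits are generated last-first, then reversed. */
size_t fmt_long(long n, char *out) {
    char tmp[24];                 /* enough for a 64-bit long plus sign */
    size_t i = 0, j = 0;
    unsigned long u = (n < 0) ? -(unsigned long)n : (unsigned long)n;
    if (n < 0)
        out[j++] = '-';
    do {                          /* peel the LAST digit each pass */
        tmp[i++] = (char)('0' + (u % 10));
        u /= 10;
    } while (u);
    while (i)                     /* reverse into printing order */
        out[j++] = tmp[--i];
    out[j] = '\0';
    return j;
}
```

Comparing something like this against the compiler's -S output is exactly the kind of exercise meant here.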
Post 03 Apr 2017, 07:00
C0deHer3tic



Joined: 25 Mar 2017
Posts: 49

Wow. That was very interesting indeed! I appreciate you for voicing your opinion. And I am looking at your source.

_________________
- Just because something is taught one way, does not mean there is not a different way, possibly more efficient. -
Post 03 Apr 2017, 08:40
vivik



Joined: 29 Oct 2016
Posts: 190

It's easy to lose your grip on reality if you don't try things out as soon as possible. Use OllyDbg and try things out as often as you can, especially if you are new to asm.

The OP pretty much talks about bottom-up versus top-down development. "Try things out immediately" is bottom-up, and "plan ahead" is top-down. Both have their advantages and disadvantages. You should try to combine them.

I remember joking about "running assembly code in your head doesn't feel too good, but maybe it is ///". Some people are good at it, but I'd prefer to avoid it if possible. There are debuggers for that. This is probably the main reason why I suck at reading source code: I neither use debuggers nor can execute code properly in my head.


Last edited by vivik on 28 Apr 2017, 12:50; edited 1 time in total
Post 22 Apr 2017, 09:13
rugxulo



Joined: 09 Aug 2005
Posts: 2124
Location: Usono (aka, USA)


C0deHer3tic wrote:

rugxulo wrote:

For example, I have written a C version of printf("%ld") that is easier to understand than a full printf() implementation and even simpler than a raw assembly version. E.g. here, study this TinyStdio and its corresponding assembly output.


Wow. That was very interesting indeed! I appreciate you for voicing your opinion. And I am looking at your source.



FYI, I didn't literally mean that TinyStdio was mine, just that I had partially written my own as well. Modern printf is way overengineered, so it's not simple at all anymore (bloat!).

Going back to the Learning Assembly topic, there are several problems with display_number (mostly because it wasn't meant for teaching and is "easy enough" for experienced programmers to understand):


  • magic numbers -- it's poor style to not use meaningful names (compile-time constants), especially without comments (but, again, Tomasz is a pro, so he doesn't need it)
  • almost everything is done in cpu registers, not memory (variables), hence it's less clear what register holds what and why


10 is decimal base. 30h is '0' (ASCII zero). UINT_MAX for uint32_t should be (roughly) 4 billion (that's 4 followed by nine zeros). He's dividing by 1 billion in order to get the first digit of the number. If you divide by 10 (like I did in my own routine), you get the last digit instead. Honestly, it's a bit confusing without a debugger!
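For illustration only (this is not Tomasz's display_number, just a C sketch of the same most-significant-digit-first idea, with a hypothetical name):

```c
#include <stddef.h>
#include <stdint.h>

/* Divide by the largest power of ten first, so digits come out in
   printing order; display_number uses the same trick with 10^9. */
size_t u32_to_dec(uint32_t n, char *out) {
    uint32_t pow10 = 1000000000u;   /* 10^9, largest power of ten below 2^32 */
    size_t j = 0;
    int leading = 1;                /* suppress leading zeros */
    for (; pow10; pow10 /= 10) {
        uint32_t d = n / pow10;     /* current most-significant digit */
        n -= d * pow10;
        if (d || !leading || pow10 == 1) {
            out[j++] = (char)('0' + d);
            leading = 0;
        }
    }
    out[j] = '\0';
    return j;
}
```

Dividing by 10 instead yields the last digit (the remainder) first, which then has to be buffered and reversed before printing.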

But another obvious tip is to not divide more than necessary (e.g. see ANSI C's ldiv, which does both div and mod at the same time, as often needed). This is probably why hex (base 16) output is so common, no (slow!) divide instructions needed!
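As a hedged C sketch of that point: ldiv(n, 10) returns both quot and rem from a single division, and a hex converter needs no division at all, only shifts and masks (u32_to_hex is a made-up name):

```c
#include <stddef.h>
#include <stdint.h>

/* Hex digits are 4-bit fields, so SHR and AND replace DIV entirely. */
size_t u32_to_hex(uint32_t n, char *out) {
    size_t j = 0;
    int started = 0;
    for (int shift = 28; shift >= 0; shift -= 4) {
        unsigned d = (n >> shift) & 0xFu;   /* one nibble at a time */
        if (d || started || shift == 0) {
            out[j++] = "0123456789ABCDEF"[d];
            started = 1;
        }
    }
    out[j] = '\0';
    return j;
}
```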

So you really have to know what you want. It's not obvious at first glance.
Post 23 Apr 2017, 21:33
rugxulo



Joined: 09 Aug 2005
Posts: 2124
Location: Usono (aka, USA)


vivik wrote:
It's easy to lose your grip on reality if you don't try things out as soon as possible. Use OllyDbg and try things out as often as you can, especially if you are new to asm.



In assembly, you're almost constantly living inside a debugger. Or at least that's my (limited) experience.

What (Windows) debuggers allow immediate instruction execution or even assembling? I'm only familiar with DOS ones, so that's not directly helpful for the OP. That kind is much more helpful than being forced to (tediously) quit, reassemble, start the debugger again, etc.


vivik wrote:

The OP pretty much talks about bottom-up versus top-down development. "Try things out immediately" is bottom-up, and "plan ahead" is top-down. Both have their advantages and disadvantages. You should try to combine them.



I honestly wasn't trying to be dogmatic, just saying that "use DIV for divide" isn't very helpful advice by itself. You have to know when to divide and how.


vivik wrote:

I remember joking about "running assembly code in your head doesn't feel too good, but maybe it is ///". Some people are good at it, but I'd prefer to avoid it if possible. There are debuggers for that. This is probably the main reason why I suck at reading source code: I neither use debuggers nor can execute code properly in my head.



Most things aren't obvious or are heavily overloaded with tons of arcane details. Not to mention bugs or too many dependencies. It's just not easy. I know some geniuses exist, but overall most of us don't write correct code the first time. Most code is either inefficient, convoluted, or buggy/wrong. (Most code needs a rewrite or heavy refactoring.)
Post 23 Apr 2017, 21:47
revolution
When all else fails, read the source


Joined: 24 Aug 2004
Posts: 15324
Location: Bigweld Industries

One thing I have noticed is that a significant portion of asm coders don't RTFM. And many of them don't even want to download TFM. You won't get far without doing those things. Intel and AMD provide the manuals for free. They are the canonical reference to use. No amount of asking in forums or reading other people's code will substitute for the actual manual. And yes, I know it is often considered bad manners to direct people to a manual, but I think this perception is wrong. Asking about things like which flags are altered by a particular instruction is extremely bad manners and wastes other people's time answering a simple question that can be self-answered immediately by looking in TFM.

I am an advocate of planning the code and algorithms before jumping in to implement something. It saves a lot of time. But it won't "teach" you about assembly; you have to already know assembly before you can code it successfully. So to learn assembly you need to code it, i.e. gain experience with it. And that includes (indeed requires) making mistakes and fixing them. You can't possibly write code without bugs. Whoever says that is crazy IMO. But you can identify bugs and fix them. A debugger can help, but so can a simple print function. And here is where planning helps a great deal. Once your algorithms and layout are finalised you can concentrate upon debugging the implementation, not the methods. If you are trying to fix both at the same time then it is easy to get lost and frustrated. Separate your tasks and have a clear outline of what you are doing, then move on to the next stage and get the implementation working.
Post 24 Apr 2017, 04:15
rugxulo



Joined: 09 Aug 2005
Posts: 2124
Location: Usono (aka, USA)


revolution wrote:
One thing I have noticed is that a significant portion of asm coders don't RTFM. And many of them don't even want to download TFM. You won't get far without doing those things.



There have been many other (quite useful, IMO) references over the years. I disagree that you can't do anything useful without the "full" manual(s). Maybe mandatory for OS writers, but userland? Nah.


Quote:

Intel and AMD provide the manuals for free. They are the canonical reference to use. No amount of asking in forums or reading other people's code will substitute for the actual manual.



Some things can't be explained except in action. You can read the Oberon-2 EBNF in a little over one page. The Oberon-07 report is only 17 pages (two of which include the EBNF).

Yet it's not enough. You still won't understand some details because they are either (accidentally or intentionally) omitted. Some things are just not obvious at first glance.

(Of course, you'll probably say, "Obviously, syntax != semantics.")


Quote:

And yes, I know it is often considered bad manners to direct people to a manual, but I think this perception is wrong. Asking about things like which flags are altered by a particular instruction is extremely bad manners and wastes other people's time answering a simple question that can be self-answered immediately by looking in TFM.



It's not at all bad manners to say, "Read Intel doc #123 [here], page 45".

Or, if the answer is so extremely obvious, just politely ignore them and let them search on their own a bit. It's not wrong to ignore people. You aren't expected to help literally everyone all the time. But some things are easier (and more obvious) than others, hence we should try to shed some light, since there are only so many hours in the day and one can't find (or experiment with) literally everything by oneself.

Even the most well-intentioned person can forget or omit to mention certain obvious restrictions or misunderstand what (to some) is "easy". Sometimes things get lost in translation (or regress or change or ...).

BTW, yes, asking for what flags are changed is a bit lazy, but it could also just be accidental forgetfulness. Oops, forgot to check my local docs, was using the wrong computer, too tired, not enough coffee, etc.

Oh, and before I forget (!!), "modern" (AVX, MOVBE, CLMUL) era x64 is hugely overkill. The docs are thousands of pages long. Honestly, it's too much for one person. It might honestly be better to direct people to older documents (e.g. [sic] 286 or 386, see OpenWatcom's FTP). And yes, I realize that I harped quite heavily for the OP to learn x64 (and not IA-32), but trying to decipher 16-bit, 32-bit, 64-bit, PAE, PSE, VT-X, SSE, AVX, etc. is just too much! The full docs have to cover literally everything while in reality very few people (if any!) care about 16-bit *and* 32-bit *and* 64-bit *and* all other extensions. "Just skip the boring parts" is still somewhat harder than it sounds (even with indexed, seekable PDFs).


Quote:

I am an advocate of planning the code and algorithms before jumping in to implement something. It saves a lot of time. But it won't "teach" you about assembly; you have to already know assembly before you can code it successfully. So to learn assembly you need to code it, i.e. gain experience with it. And that includes (indeed requires) making mistakes and fixing them. You can't possibly write code without bugs. Whoever says that is crazy IMO. But you can identify bugs and fix them. A debugger can help, but so can a simple print function. And here is where planning helps a great deal. Once your algorithms and layout are finalised you can concentrate upon debugging the implementation, not the methods. If you are trying to fix both at the same time then it is easy to get lost and frustrated. Separate your tasks and have a clear outline of what you are doing, then move on to the next stage and get the implementation working.



I don't know, it's all a mess. I don't have good advice. It's hard to tell anybody anything because the computing world is so complicated. But indeed, a lot of code isn't well-tested and certainly isn't meant for teaching. Optimized code is not for the faint of heart. Simplified code, even if slow (or limited), still has a use. But, in the real world, it's somewhat hard to find code that is clear but slow. ("Who wants inefficient code? It's not worth teaching!") Reading one byte at a time (or perhaps something like bubble sort) is heavily frowned upon, but sometimes it's the simplest answer.
Post 24 Apr 2017, 07:59
vivik



Joined: 29 Oct 2016
Posts: 190


rugxulo wrote:
What (Windows) debuggers allow immediate instruction execution or even assembling? I'm only familiar with DOS ones, so that's not directly helpful for the OP.



OllyDbg works on Windows. I haven't seen a better debugger. It's completely independent of the compiler's debug info and opens any exe. It's also a bit of a pain to use, but that's what you get for not using the advantages of civilization.

It's far from a proper learning and programming environment, but it's the best thing I've found so far.


revolution wrote:
One thing I have noticed is that a significant portion of asm coders don't RTFM. And many of them don't even want to download TFM.


Three things I hate are: walls of text, PDF, and anything that goes in alphabetical order. And Intel manuals have all of them.
Post 24 Apr 2017, 14:54
revolution
When all else fails, read the source


Joined: 24 Aug 2004
Posts: 15324
Location: Bigweld Industries


vivik wrote:
Three things I hate are: walls of text, PDF, and anything that goes in alphabetical order. And Intel manuals have all of them.

Well, that's what manuals are.

What sort of manual would you prefer to read? Can you suggest a different way of organising it?
Post 24 Apr 2017, 15:18
vivik



Joined: 29 Oct 2016
Posts: 190


Quote:

What sort of manual would you prefer to read? Can you suggest a different way of organising it?


Well, when I was reading "Design Patterns: Elements of Reusable Object-Oriented Software", one thing I liked was that it had something like 4 different indexes, 4 different ways of showing the connections between different patterns. It's visible that they made an additional effort to make their book easier to understand.

Being easy to understand is a job for tutorials, not for references. The only important thing for a reference is to be correct and to be complete. It's rare for a reference to also be easy to read.

I got a list of mnemonics by their usage from the fasm manual.

I got mnemonics by frequency of usage from reading the fasm source code itself.

I got mnemonics by their opcode from here, here and here.

Still hoping to find something like a cookbook, with snippets of code in assembly. I guess I'll just write my own collection manually, or gather one from different forum posts.
Post 25 Apr 2017, 08:11
shutdownall



Joined: 02 Apr 2010
Posts: 518
Location: Munich

Funny discussion. I think the question is not what is the best language to program in, as this depends on your desires, environment and many things more. The very first question, in my eyes, is WHY do you want to learn programming, and next WHAT do you want to do or need it for.

There are special languages for special purposes. If it's just to try out what programming is and what you can do and see if you get addicted to it, then I would propose BASIC to start. This is quite easy and fast to learn and has many tutorials and sample code.

If you need programming for a special purpose then you should take the suitable language for the machine where it should run (maybe a high performance computer ?) and see what libraries exist which can be used. Libraries will speed up development of course. Often the library or the environment and available compilers will decide what language can be used.

Debugging is a separate point. I think there is no programmer in the world who does not produce buggy code. There are attempts to avoid buggy code and to give stable feedback for errors at runtime, like Java for example. But in general, code cannot be checked logically for whether it really does what you want. Compilable code does not mean that there are no bugs in the code.

The best way to start is from top to bottom - so higher-level languages first to get experience, and go deeper into lower-level languages as you become more experienced. I think experience is the main factor in development; experienced developers will write code more generally, think of limits, decide which approach would be better (often you have many ways to solve a problem), and so on.

So what is assembly about, and why do some people develop in assembly? The answer is quite simple. This is the language that the specific machine (cpu) understands, and if you want to program an operating system, this would be the choice to use. C or C++ are higher-level languages but lack some specific instructions needed for hardware control (like IN/OUT instructions). These often have timing considerations (no interrupt allowed in specific code sequences).

The other point is that C and other high-level language variants use the stack extensively, and you may first have to learn what a stack is, how it is organized, and what to do to move the stack somewhere else or to load code into specific address areas. Here things get complicated, and such tasks are often better solved in assembly than in high-level languages. But you need a lot of knowledge about the underlying hardware as well.

So what are the main tasks for assembly:
* hardware drivers / hardware layers of an OS
* speed requirements

The main disadvantage of assembly is: it is not easily portable from one machine to other hardware (another cpu or another OS). So you have to keep in mind that your assembly is not easy to port.

Why do developers program in assembly at all when there are easier languages for general-purpose software? Because they are so familiar with assembly that it is not much more difficult to program any task in that language. An Italian who speaks English prefers to speak English in Spain. But if he is very familiar with Spanish, then he might prefer to speak Spanish instead of English.

It is often a matter of choice, skills, environment and task which decides the language used. And personal preferences, of course. Humans have a choice of which language to speak, so why not have a choice of which language to program in? If you are new and want to solve more general tasks you may use C (like English) as a general-purpose language. There are people solving tasks in the Forth language as well. It is often used in scientific environments but is not much use for writing a USB driver. Wink
Post 27 Apr 2017, 13:03
rugxulo



Joined: 09 Aug 2005
Posts: 2124
Location: Usono (aka, USA)


vivik wrote:
Still hoping to find something like a cookbook, with snippets of code in assembly. I guess I'll just write my own collection manually, or gather one from different forum posts.



Not sure what kind of snippets you mean. You mean like Forth words but coded in assembly? (What about a Forth in assembly?? ciforth??)

Or do you mean like these?

Bit Twiddling Hacks (C, not ASM)
the Assembly Gems page

Or just general tips (like from Agner Fog's manuals)?

Other than that, dunno! Did you have anything in mind specifically?
Post 28 Apr 2017, 00:18
rugxulo



Joined: 09 Aug 2005
Posts: 2124
Location: Usono (aka, USA)


shutdownall wrote:
There are special languages for special purposes.

The best way to start is from top to bottom - so higher-level languages first to get experience, and go deeper into lower-level languages as you become more experienced.



I stumbled upon this old link. I haven't fully read it (nor associated texts), but it sounds interesting. This C.S.-ish Preface to his (free) book seems pretty clear on what Comp. Sci. is all about. (I don't honestly have the aptitude to fully summarize it here.)

It focuses on Scheme, intentionally, instead of a more "traditional" language.

I don't fully understand their dislike of Algol/Pascal here, but overall they seem to think those are too low-level and tedious to use. Yet Pascal was always high-level, abstract, great for algorithms and data structures, recursion, etc. (But this was mostly written in 1993 and presumably ignores the Extended and Turbo dialects, not to mention the forthcoming Delphi.) Even Modula [sic] and Oberon only barely get a mention as "more flexible ... but not flexible enough".

Who knows, perhaps bignums and garbage collection, built-in list primitives, etc. make things easier for some tasks. I don't know the intimate details. But I've seen a Lisp written in (Free)Pascal, so how bad can it be?? Razz
Post 28 Apr 2017, 00:36
vivik



Joined: 29 Oct 2016
Posts: 190


rugxulo wrote:
Not sure what kind of snippets you mean.


I meant "ready to copypaste and use code".

The second pack of links is very close, but probably too advanced for me right now. General advice is good too; somebody already showed me this link before.

For example, just to show my current level: I probably wouldn't know the correct way to do the division if I hadn't asked about it. I was using XOR EDX,EDX instead of just CDQ. When I was reading the list of assembler mnemonics, I just skipped all the type conversion instructions, thinking I wouldn't need them any time soon. If this is mentioned in the Intel manuals, I would read them immediately. (Also, sorry for asking about flags that time; it won't be repeated.)
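A hedged C illustration of why that matters: IDIV wants EDX:EAX sign-extended (that's CDQ), while DIV wants EDX zeroed, and the same 32-bit pattern divides differently depending on signedness. (sdiv/udiv are made-up names; the codegen comments describe what x86 compilers typically emit, not a guarantee.)

```c
#include <stdint.h>

int32_t sdiv(int32_t a, int32_t b) {
    return a / b;     /* typically compiled to: cdq / idiv */
}

uint32_t udiv(uint32_t a, uint32_t b) {
    return a / b;     /* typically compiled to: xor edx,edx / div */
}
```

sdiv(-10, 3) is -3, but udiv((uint32_t)-10, 3) treats the bits as 4294967286 and yields 1431655762, so XOR EDX,EDX is only right for the unsigned case.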

About what kind of snippets would be useful for me: I want to be able to do everything I could do in Python, all the data structures from there and more. Lists, dictionaries, UTF-8 strings, maybe even garbage collection, compression algorithms, CRC32 and MD5 hashes, JPG and PNG file formats. I need to learn a lot before I can do something useful with assembly.

(Many of those should be in libraries and not in code snippets, though. I could potentially just use C libraries with my code. Yeah, I must find a good C debugger and play with those libraries.)
Post 28 Apr 2017, 10:51
vivik



Joined: 29 Oct 2016
Posts: 190


shutdownall wrote:
The best way to start is from top to bottom - so higher-level languages first to get experience, and go deeper into lower-level languages as you become more experienced.


I want to add to this: high-level languages don't always make life easier for you. Sometimes they make things more complex in their own way, by introducing things that are clever and abstract but not that useful. Things like OOP, design patterns, and monads often create more problems than they solve. One probably shouldn't go too deep into them, or at least should be careful with them.

This is the point that many programmers and job interviewers may not agree with, so one probably should learn this trash anyway.
Post 01 May 2017, 06:53
Trinitek



Joined: 06 Nov 2011
Posts: 257


vivik wrote:

shutdownall wrote:
The best way to start is from top to bottom - so higher-level languages first to get experience, and go deeper into lower-level languages as you become more experienced.


I want to add to this: high-level languages don't always make life easier for you. Sometimes they make things more complex in their own way, by introducing things that are clever and abstract but not that useful. Things like OOP, design patterns, and monads often create more problems than they solve. One probably shouldn't go too deep into them, or at least should be careful with them.

This is the point that many programmers and job interviewers may not agree with, so one probably should learn this trash anyway.

Some OOP design patterns are designed to work well with large projects and don't scale down well to a project containing only a handful of files (i.e., they incur more boilerplate than they're worth to maintain). Of course job interviewers want you to be aware of what patterns exist and when to apply them, because they want you to be able to jump into projects of different sizes when they need you to.

But that's design patterns. I'm going to assert that if you think a HLL is making life harder for you, you don't understand how to use the language. Languages don't force a design pattern, the programmer does.
Post 01 May 2017, 22:10
rugxulo



Joined: 09 Aug 2005
Posts: 2124
Location: Usono (aka, USA)


vivik wrote:

rugxulo wrote:
Not sure what kind of snippets you mean.


I meant "ready to copypaste and use code".



I'm still not sure what kind you're expecting. Ready-to-use routines? For what exactly? (e.g. POPCNT? UpCase? Hex2Dec?) And you'd still have to choose baseline for cpu compatibility (386, 686, etc).


vivik wrote:

For example, just to show my current level: I probably wouldn't know the correct way to do the division if I hadn't asked about it. I was using XOR EDX,EDX instead of just CDQ. When I was reading the list of assembler mnemonics, I just skipped all the type conversion instructions, thinking I wouldn't need them any time soon. If this is mentioned in the Intel manuals, I would read them immediately.



Signed vs. unsigned makes things much more complicated. That's why several languages (classic Pascal, Oberon, Ada83, Java) didn't have unsigned at all. The "CARDINAL" type (unsigned) in Modula-2 was more crucial for 16-bit machines, but it complicated assignment compatibility. Oberon was 32-bit from the start, so it didn't need to worry about arrays/size_t and hence just did away with it. (Similarly, AWK and older Lua just used double [float] for everything, since it could handle quasi 32-bit ints equally well. Of course, Lua 5.3 later added 64-bit ints natively for various reasons.)

So, out of type simplicity, most people just use "large as possible" (signed) integer for everything.
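As a small hedged C example of the kind of complication meant here (lt_mixed is a made-up name): mixing signedness silently converts the signed operand.

```c
/* Usual arithmetic conversions: the int operand is converted to
   unsigned, so -1 becomes UINT_MAX and the comparison is false. */
int lt_mixed(int i, unsigned u) {
    return i < u;
}
```

lt_mixed(-1, 1) returns 0, even though -1 < 1 is obviously true for plain ints. That's the sort of trap "signed everywhere" sidesteps.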


vivik wrote:

(Also, sorry for asking about flags that time; it won't be repeated.)



There are other (older, simpler) references that mention flags used (or even pseudocode), like HelpPC or ASM86FAQ.

Nobody honestly understands all of x86 anymore. CISC has never been more complicated.


vivik wrote:

About what kind of snippets would be useful for me: I want to be able to do everything I could do in Python, all the data structures from there and more. Lists, dictionaries, UTF-8 strings, maybe even garbage collection, compression algorithms, CRC32 and MD5 hashes, JPG and PNG file formats. I need to learn a lot before I can do something useful with assembly.



You might be biting off more than you can chew! Just check the assembly output of a suitable compiler (PyPy? GCC? FPC?), and go from there.

There are assembly versions of various things, but they're usually cryptic and don't explain the underlying math / algorithms well. (CRC32 and JPEG have been done in DOS assembly, for instance. Of course, FPC has its own native CRC32 / MD5 routines. Compression is best studied in HLLs like C or FPC, but I've seen simpler tools in assembly too, e.g. [DOS] Kaboom.)
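For a taste of the "underlying math" angle, here is a minimal bitwise CRC-32 sketch in C (the common IEEE 802.3 reflected polynomial, 0xEDB88320); this is not taken from any of the projects mentioned above, and real implementations are table-driven for speed:

```c
#include <stddef.h>
#include <stdint.h>

/* Bitwise CRC-32: one bit per iteration; a production version would
   precompute a 256-entry table and go byte-at-a-time. */
uint32_t crc32(const uint8_t *data, size_t len) {
    uint32_t crc = 0xFFFFFFFFu;                  /* standard initial value */
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int b = 0; b < 8; b++) {
            if (crc & 1u)
                crc = (crc >> 1) ^ 0xEDB88320u;  /* reflected polynomial */
            else
                crc >>= 1;
        }
    }
    return ~crc;                                 /* final inversion */
}
```

The standard check value: crc32 over the ASCII bytes "123456789" should come out 0xCBF43926.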


vivik wrote:

(Many of those should be in libraries and not in code snippets, though. I could potentially just use C libraries with my code. Yeah, I must find a good C debugger and play with those libraries.)



Check the assembly output, e.g. "gcc -c -S -masm=intel".
Post 02 May 2017, 00:25


Copyright © 2004-2017, Tomasz Grysztar.