flat assembler
Message board for the users of flat assembler.

revolution
RAM should be RWM to compare properly with ROM. So how ya gonna say RWM? Razz
Post 25 Aug 2009, 15:02
f0dder
True, asm isn't an acronym, but it's a "somewhat funny to pronounce" abbreviation Smile

revolution: "rawm", perhaps? Almost pronounceable Smile

Languages are messy, anyway. In everyday Danish, some terms are in English, some in Danish, some are "Danishified", some words are kept in English but with Danish pronunciation, some acronyms are spelled out whereas others are pronounced.

Sometimes these quirks help conversation flow more fluidly, other times... well... please don't say "observer mønstret", stick to "observer pattern" please.
Post 25 Aug 2009, 15:43
revolution
Maybe we should take the name of "W" from English (double-U) and pronounce it RUUM. That almost becomes ROOM, so then we have ROOM and ROM. Welsh uses "W" as a vowel, and there it has a short "OO" sound.
Post 25 Aug 2009, 16:08
Borsuc
By the way, when I said hexadecimal, I meant that you can represent 0-9/A-F with 4 bits, of course, not with ASCII.
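A minimal C sketch of that point (the example digits are arbitrary): two 4-bit hex digits pack into one byte, whereas their ASCII spelling takes one byte per digit.

Code:
/* Two hex digits are two 4-bit values, so they pack into a single byte,
 * while the ASCII text "A7" needs one byte per digit. */
#include <stdio.h>

int main(void)
{
    unsigned char high = 0xA;                        /* digit 'A' as a 4-bit value */
    unsigned char low  = 0x7;                        /* digit '7' as a 4-bit value */
    unsigned char packed = (unsigned char)((high << 4) | low);

    printf("packed byte: 0x%02X\n", packed);         /* prints 0xA7 */
    printf("ASCII \"A7\" takes %zu bytes\n", sizeof "A7" - 1);
    return 0;
}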
Post 25 Aug 2009, 16:22
Azu
Borsuc wrote:
@shoorick: I can't read that, Latin or non-Latin. Since most people use English, and since English is simple (in the number of letters, I mean), without "special" letters requiring more bytes, it is the most logical choice.

Of course, if people learned hexadecimal that would be even more logical, and simpler, but we aren't there yet. However, we ARE closer with English (or other languages with simple characters), but we're stubborn.

It's like wanting to move from binary to decimal: that definitely isn't done for simplicity's sake.

Not to mention, English is the "father" language of technical matters, and no wonder, since computers (and computer terms) were invented in English (in the USA). Ever heard how a French speaker pronounces DVD? It's ridiculous: he says it as if it were a French word and not an acronym (acronyms don't have "dialects", "different meanings" or other crap like that).

Azu wrote:
What if you want to hard-code some of the strings (e.g. an error page or something) for the website? Wouldn't it be much easier if the compiler supported it instead of you having to write them in as bytecode?
Import a file.

Well, some compilers/languages just don't allow it, like FASM, so... their fault.
A file made in... an editor that supports Unicode. Oops. Looks like you still need an editor that supports Unicode to do it then.


Personally, I find it simpler to use one program than two.
Post 26 Aug 2009, 00:31
windwakr
f0dder wrote:
I beg to differ; have you ever met anybody who pronounces RAM as R-A-M, native English speaker or not? Do you pronounce ASM as A-S-M?


How DO people pronounce ASM? I say "Azim". Same with FASM: I pronounce it "Fazim". Is that "right"? Is there a "right" way to pronounce it?

Post 26 Aug 2009, 00:36
Azu
windwakr wrote:
How DO people pronounce ASM? I say "Azim". Same with FASM: I pronounce it "Fazim". Is that "right"? Is there a "right" way to pronounce it?
I prefer "ef ei es em".
Post 26 Aug 2009, 00:37
f0dder
windwakr wrote:
How DO people pronounce ASM? I say "Azim". Same with FASM: I pronounce it "Fazim". Is that "right"? Is there a "right" way to pronounce it?
About the same here, although I usually say "assembly" instead of "asm". Dunno if there's a right way, but iirc Tomasz himself doesn't spell it out Smile

Post 26 Aug 2009, 05:11
sinsi
Hmmm...

asm=a s m
fasm=fazzum
masm=mazzum
jwasm=jay wazzum Smile

It can be important, as I discovered in the '90s with C++:
"Huh? See double plus? Oh <snicker> do you mean see plus plus?" Embarassed
Post 26 Aug 2009, 05:19
shoorick
Quote:

@shoorick: I can't read that, Latin or non-Latin.

Then you look like a man who gives advice on a matter he does not understand.

Quote:

Since most people use English

Except on the internet, I do not know a single person who uses English.

Quote:

"special" letters requiring more bytes, it is the most logical choice

I've been using 7-bit symbols (with Cyrillic in place of the small Latin letters; have you ever thought that those small Latin letters are absolutely useless?) on a PC with 36 KB of system RAM and a 2 MHz 8-bit CPU. Now I have a PC with 504,000 KB of system RAM and a 3000 MHz 32-bit CPU, so why should I worry about 16-bit symbols? Why should I give up my national customs because of machines? Do I exist for the computer, or does the computer exist for me? It seems you will never understand this, just as I will never agree with you on this question.


[Attachment: koi7.png. Description: try to find the symbol table in the ROM bitmap ;)]
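A rough C sketch of the kind of 7-bit scheme described above, where part of the lowercase-Latin range is reused for Cyrillic letters. The partial mapping below is only illustrative and is not the exact KOI-7 table.

Code:
/* Sketch of a 7-bit character set that reuses the lowercase-Latin range
 * for Cyrillic, as described above. The mapping covers only a few codes
 * and is illustrative, not the exact KOI-7 layout. */
#include <stdio.h>
#include <stddef.h>

static const char *decode_7bit(unsigned char c)
{
    /* Hypothetical partial mapping of codes 0x60..0x67 to Cyrillic letters. */
    static const char *cyr[] = { "ю", "а", "б", "ц", "д", "е", "ф", "г" };
    static char ascii[2];

    if (c >= 0x60 && c <= 0x67)
        return cyr[c - 0x60];            /* repurposed lowercase-Latin slot */

    ascii[0] = (char)c;                  /* everything else stays plain ASCII */
    ascii[1] = '\0';
    return ascii;
}

int main(void)
{
    const unsigned char text[] = { 'R', 'O', 'M', ' ', 0x61, 0x62 };

    for (size_t i = 0; i < sizeof text; i++)
        printf("%s", decode_7bit(text[i]));   /* prints "ROM " plus two Cyrillic letters */
    printf("\n");
    return 0;
}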



Post 26 Aug 2009, 05:46
Borsuc
shoorick wrote:
Except on the internet, I do not know a single person who uses English.
Isn't the FASM forum on the internet?
shoorick wrote:
Why should I give up my national customs because of machines?
Cars are fast enough these days, so why should we give up our childhood dream of square wheels, since they would work anyway? (Just an example, mind you Very Happy)

Computer progress is supposed to pull us humans along toward simplicity. We should advance with computers, not leech off them.

As for national customs, I didn't say you had to give up on them. But your national customs definitely don't include computers. (English in computing isn't a "national custom"; it just happens to be that way because computers were invented in the USA.)

I don't think any traditional culture or language, for example, had computers in mind, so it's not like you're changing it or giving it up or anything.

Post 26 Aug 2009, 15:16
Borsuc
Here's a more technical approach that perhaps shows my point better.

You do know that in many areas standards get broken, then two or more competing standards get adopted, and it turns into a complete mess, right? Then the people in camp A complain that you have to accept their standard as well, but then camp B complains, and so on.

If both camps had followed the damn initial standard, none of this mess would have happened and drivers could easily be written for all sorts of things (the USB situation is an utter mess). KISS: Keep It Simple, Stupid.

This is what happens when people want to use their own "stuff" instead of agreeing on only ONE. It is a pain in the ass mostly for the people who have to write the operating systems and software that deal with all of them.

I think this doesn't just go against simplicity; it turns into a complete and utter mess, chaos if you will. It's not that computers aren't capable of it, since they are perfectly capable of holding millions of standards; it is the simplicity, or rather the LACK of it, and the pain in the ass that results from it. Humans are at fault here, not the computers' capabilities, because humans keep using different standards. Online "translators" wouldn't even be needed otherwise; they exist because software has to support every standard, and I bet they were a pain in the ass for whoever had to write them (or for whoever didn't have them!).

In this case, the standard would be a language. Since English was the first one used in computing, is the most popular one where computers are concerned, and is the one used in technical documents, I think it is obvious that it should be "the standard".

Then, when people want to communicate about technical matters such as programming, they wouldn't need to go into PITA mode and learn all the standards; they would just use one standard, clean and simple, instead of total chaos.


Again, I'm only talking about the technical side here, that is, computers, not "culture" or anything like that; unless your tradition includes computers, but I doubt it Wink

Post 26 Aug 2009, 17:46
Azu
If you just want what is technically smallest and most efficient for computers, ISO-8859-1 definitely isn't it.

At the very least, the simplest scheme to implement that would still be far smaller than what we have now would be a UTF-8-like system that works on words and punctuation instead of individual characters, since many of the possible combinations of characters never occur in real words.
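A rough C sketch of that idea, assuming frequent whole words get short codes from a ranked dictionary and unknown words fall back to literal characters; the tiny dictionary, the one-byte codes and the 0xFF escape are hypothetical.

Code:
/* Sketch of the word-level idea above: each whole word is looked up in a
 * frequency-ranked dictionary and emitted as one short code, instead of one
 * code per character. Dictionary, code sizes and escape byte are made up,
 * just to show the shape of such a scheme. */
#include <stdio.h>
#include <string.h>

static const char *dictionary[] = { "the", "of", "and", "to", "error" };
#define DICT_SIZE (sizeof dictionary / sizeof dictionary[0])

/* Emit one token: a dictionary index for a known word, otherwise an escape
 * byte followed by the literal characters. */
static void emit_word(const char *word)
{
    for (size_t i = 0; i < DICT_SIZE; i++) {
        if (strcmp(word, dictionary[i]) == 0) {
            printf("code %zu (\"%s\": 1 byte instead of %zu)\n",
                   i, word, strlen(word));
            return;
        }
    }
    printf("escape 0xFF + %zu literal bytes (\"%s\")\n", strlen(word), word);
}

int main(void)
{
    const char *sentence[] = { "the", "error", "page" };

    for (size_t i = 0; i < sizeof sentence / sizeof sentence[0]; i++)
        emit_word(sentence[i]);
    return 0;
}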
Post 26 Aug 2009, 18:27
Borsuc
That's more complex, because the software has to be aware of words and punctuation instead of treating everything as just "another character". That's added complexity.

Post 26 Aug 2009, 18:37
Azu
Borsuc wrote:
That's more complex, because the software has to be aware of words and punctuation instead of treating everything as just "another character". That's added complexity.
The encoding of the text itself would be a lot simpler and smaller. As for whether programs could process it faster, that depends on the implementation.
Post 26 Aug 2009, 18:41
Borsuc
Smaller or not, it would still be complex. Compressed data is sometimes very small too, but that doesn't mean compression isn't complex.

I don't know much about UTF-8 though, so take my opinion with a grain of salt. Smile

Post 26 Aug 2009, 18:42
Azu
Borsuc wrote:
Smaller or not, it would still be complex. Compressed data is sometimes very small too, but that doesn't mean compression isn't complex.

I don't know much about UTF-8 though, so take my opinion with a grain of salt. Smile
The most frequently used symbols fit in the first 7 bits.
If bit 8 is set, process the next byte as well.
Repeat until bit 8 isn't set.

Or something like that...
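That is essentially a continuation-bit code. A small C sketch of a decoder for the scheme as described (not real UTF-8; the symbol numbering is whatever a hypothetical encoder assigned):

Code:
/* Decoder for the continuation-bit scheme described above (not real UTF-8):
 * the low 7 bits of every byte carry payload, and bit 8 (0x80) means
 * "the next byte belongs to the same symbol". Codes are assumed to fit
 * in 32 bits. */
#include <stdint.h>
#include <stdio.h>
#include <stddef.h>

/* Decode one symbol starting at p, store it in *out, return bytes consumed. */
static size_t decode_symbol(const uint8_t *p, uint32_t *out)
{
    uint32_t value = 0;
    size_t   n     = 0;

    do {
        value |= (uint32_t)(p[n] & 0x7F) << (7 * n);  /* 7 payload bits per byte */
    } while (p[n++] & 0x80);                          /* bit 8 set: keep reading */

    *out = value;
    return n;
}

int main(void)
{
    /* 0x41 is a one-byte code; 0xC3 0x12 is a two-byte code. */
    const uint8_t stream[] = { 0x41, 0xC3, 0x12 };
    size_t i = 0;

    while (i < sizeof stream) {
        uint32_t sym;
        i += decode_symbol(stream + i, &sym);
        printf("symbol %u\n", (unsigned)sym);
    }
    return 0;
}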
Post 26 Aug 2009, 18:45
shoorick
Oh, Borsuc, Borsuc! All your proof rests on the assumption that if you have not met a problem, then it definitely does not exist, and that is what makes this a stupid discussion. I hope you never have to use 16-bit chars!
++++++++++++++++++++++

I couldn't resist posting this. Nice report, isn't it?


[Attachment: report.png]


Post 27 Aug 2009, 05:00
Borsuc
shoorick, like I said before, let's go back to the "multiple standards" example (about simplicity, not languages!). If you start supporting broken standards, they will keep popping up because... well, they are supported.

The only way to make people use just one standard is to not support the others; eventually they will get the message.

Can you imagine what kind of chaos it would be if the Windows API had 5 different standards depending on Windows version/language/whatever? Can you imagine how far from simple it would be for programs that had to support all of them?

(USB implementations and other devices are already in pure chaos!)

It's not the computer's capabilities that are at fault here, since a computer can even run multiple operating systems these days. It's the humans' fault; they are the ones setting multiple standards. And it goes against simplicity.

What's simpler: everyone having to learn both Linux and Windows, or just one of them? What's simpler: everyone having to program for both Linux and Windows, or just one of them?

Now extend this to the huge number of languages in the world. They are a pain in the ass for both programmers AND users. It would be far easier and simpler if everyone used only one standard.


Here's a "user" example:
I don't know about you, but when I get my hands on a keyboard with a different layout, like AZERTY or whatever they use in Dutch-speaking countries, I type about 1 letter per second instead of the 5-9 per second I usually manage.

It's a pain in the ass, I repeat, not just for the keyboard manufacturers but for users as well. If everyone used one type of keyboard and one language when discussing technical, computer-related matters, we wouldn't need translators or software that handles multiple languages, and we wouldn't throw simplicity away and waste bytes (parsing ASCII is also easier than parsing Unicode).


This is how all the mess and chaos begins. People want "different" ways to do something (in this analogy, to communicate), and it only makes a mess of things, confuses others who aren't aware of it, and is a pain in the ass for everyone (programmers and users).

SCENARIO 1 (utter mess):
Guy 1: "Hey, we established the USB standard specification."
Guy 2: "Nah, I like method xyz better."
Guy 3: "You both suck. My implementation is the best!"

Programmers: "Alright, we implemented the standard specification, but then people complained that other devices didn't work. We found out that they use Guy 2's specs, so we had to include those too. Then other people complained that even more devices didn't work; it turned out they used Guy 3's specification. For freak's sake, couldn't they just use one single specification (language)?"

This is utter chaos and a mess for the users (who complain and probably have to learn the specs), the hardware manufacturers, and the programmers.


SCENARIO 2 (the PROPER way, even with flawed humans):
Programmers: "Alright, we implemented the standard specification, but then people complained that other devices didn't work. We told them we refuse to implement broken standards and that they should complain to the device manufacturer for using a different standard. This is the only way to keep things in order and not end up in pure chaos."

Razz
Post 27 Aug 2009, 14:23
Azu
Borsuc wrote:
What's simpler: everyone having to learn both Linux and Windows, or just one of them? What's simpler: everyone having to program for both Linux and Windows, or just one of them?
I agree; Windows should just die and make way already. Everything related to computers would be so much better then.
Post 27 Aug 2009, 14:28
