flat assembler
Message board for the users of flat assembler.

Index > OS Construction > rewrite linux in asm with fasm

TmX



Joined: 02 Mar 2006
Posts: 843
Location: Jakarta, Indonesia
TmX 06 Oct 2010, 04:07
edfed wrote:

but where is the "open source" philosophy when you need a full day to compile, and can't be sure of the result...


what does open source philosophy have to do with compilation time?
ManOfSteel



Joined: 02 Feb 2005
Posts: 1154
ManOfSteel 06 Oct 2010, 11:42
rugxulo wrote:
Last I checked, Vista (Home Premium on up, not that joke called Home Basic) and Win7 both need at least 1 GB of RAM and 16 GB of HD to run. OpenSuSE also last I checked "recommended" 1 GB. I know for a fact that XP can run in much much less, but they want to phase that out because "it's old".

With GNU/Linux and *BSD, you can choose what you need and remove the rest. This stands for both applications and system tools as well as the features inside these applications and system tools (and the kernel). This means even the latest release can still run on very old machines *if* you choose wisely. It is a very simple thing to do once you are aware of the available alternatives. And today we have the Internet...

With Windows, you can barely disable a few resource-hungry processes and features, but you would not be able to reduce that to, say, half of what is required by default. The only alternative is to run deprecated, vulnerable and sometimes unstable versions such as Windows 9x or perhaps Windows 2000.

rugxulo wrote:
Even Ubuntu needs gigs and gigs for a full install.

As I previously said, these are marketed as replacements for Windows and therefore have to support a lot of hardware and have eye-candy interfaces to be able to compete. But there is a choice you can make here.

rugxulo wrote:
A review of some *BSD I read (perhaps NetBSD) bragged about how it could run in bare console in "as little as 40 MB" ... doing what??? That's a lot of RAM, but the problem is that nobody knows and nobody cares.

You would be surprised! Even when I am under X, I use many command-line applications. I can even configure a kernel in such a way that the console switches to high VESA modes. I would then be able to run a window manager/terminal multiplexer, with (E)Links, Midnight Commander, MPlayer, etc. on a shell, and I would never run X unless I needed a graphical application (e.g. photo retouching). Or I could simply run these applications on different vttys.

So again, this is a choice you can make. You want a system with point-and-click skinned applications, animated menus, shadows and translucent windows? Get a modern machine. You can live without these superficial "features", care to read a man page and will not cry for help the minute you see the output of dmesg? Very good, you can keep your old machine and still use the latest release.
Just do not try to defy logic.

Also, is using 40MB of RAM a lot? Even a 10-year-old machine has 64 or 128MB!

rugxulo wrote:
One guy with an ultra modern rig told me, "Compiling is fast enough, it doesn't need to be faster", ugh. Surely he doesn't compile much then.

Multicore machines with 2GB or more and the right settings can buildworld in less than an hour. Depending on the machine specs and the settings used, I have seen people completing the entire process in 40, 20 or even 12 minutes!
edfed



Joined: 20 Feb 2006
Posts: 4353
Location: Now
edfed 06 Oct 2010, 13:44
TmX wrote:
edfed wrote:

but where is the "open source" philosophy when you need a full day to compile, and can't be sure of the result...


what does open source philosophy have to do with compilation time?


hem... if an open source project takes one day to compile, it is effectively closed, because it will be very hard to do quick tests in order to apply major modifications. but it takes an open mind to understand this.


even in pure console mode, linux is bloated.
a rewrite in asm, then, is meant to reduce the memory requirements, increase compilation speed, and filter out what is really needed from what is pure waste.

just as ReactOS is a project to provide a Windows-compatible open source OS, an asm version of linux would make it a very fast and slim OS.
revolution
When all else fails, read the source


Joined: 24 Aug 2004
Posts: 20451
Location: In your JS exploiting you and your system
revolution 06 Oct 2010, 13:59
Well my experience with Linux has been bad. This was not due to bloat but actually lack of bloat (in a sense). Because there were/are no drivers to support the hardware. Without drivers no OS can run on a system.

Some of the "bloat" you are describing is drivers. But without the drivers people couldn't use the OS. Only a few, of thousands of drivers, is actually needed in any one system, but you can't know beforehand which ones to include. So you have to include them all, even the obscure and weird ones that "nobody" uses.

Any asm version of any OS will also have to contend with the driver problem. If you have only a few drivers then you limit the places the OS can run and/or you limit the functionality you can provide. However, if instead you have every conceivable driver then you will get complaints about having too much bloat. Either way you can't win.
bitRAKE



Joined: 21 Jul 2003
Posts: 4073
Location: vpcmpistri
bitRAKE 06 Oct 2010, 14:17
With the prevalence of the cloud, all an OS needs is network drivers. Even the hardware detection could mostly be downloaded. The user could even provide some guidance to the process (if only to hide the latency from the user's mind).
ManOfSteel



Joined: 02 Feb 2005
Posts: 1154
ManOfSteel 06 Oct 2010, 14:47
edfed wrote:
a rewrite in asm, then, is meant to reduce the memory requirements, increase compilation speed, and filter out what is really needed from what is pure waste.

Nothing is pure waste or else people would not, well, waste their time on it.
It all depends on who you are targeting.
Octavio



Joined: 21 Jun 2003
Posts: 366
Location: Spain
Octavio 06 Oct 2010, 16:11
ManOfSteel wrote:

I have done so with relatively small applications such as WMs or curses games and the results are positive.
C compilers can sometimes "miss" things and somehow uglily "optimize" code or compile code in really weird ways that only a human coder can spot and fix manually.

And how much time would you need to translate the whole of Linux?
Coty wrote:
Do you use FASM or GCC?
I use Octasm, but I think that bloat is not a problem of the programming language; it is just the lack of interest in doing things better.
Rugxulo wrote:

Another guy with quad core 64-bit and tons of RAM (8 GB?) could rebuild Linux with X11 and everything in like 12 hours (Gentoo?).

Hello Rugxulo, that's about $3 of electricity. Definitely I should rename OctaOS to eco_os or green_os.
TmX



Joined: 02 Mar 2006
Posts: 843
Location: Jakarta, Indonesia
TmX 06 Oct 2010, 17:46
edfed wrote:
hem... if an open source project takes one day to compile, it is effectively closed, because it will be very hard to do quick tests in order to apply major modifications. but it takes an open mind to understand this.


OK... I guess you should try a very slim Linux distro, something like Tiny Core Linux. It's very small, and I guess it only takes a few hours to build from source.
f0dder



Joined: 19 Feb 2004
Posts: 3175
Location: Denmark
f0dder 14 Oct 2010, 02:19
edfed wrote:
Quote:
what does open source philosophy have to do with compilation time?
hem... if an open source project takes one day to compile, it is effectively closed, because it will be very hard to do quick tests in order to apply major modifications. but it takes an open mind to understand this.
I wonder which project would take that long to compile on a reasonable machine?

My server, a humble 1.6GHz Intel Celeron 420 with 2GB of RAM and a slow primary/OS disk (5400rpm, iirc) running gentoo rarely takes more than a minute when building an app (exceptions are the huge stuff like GCC and the linux kernel) - and that's when upgrading to a newer version, which means re-compiling every file.

While doing development work, you don't recompile every file - using makefiles, or other build environments, combined with a modular source code split means you only need to rebuild the files that have been changed (and you get the additional bonus of a sane, non-spaghetti, comprehensible project organization). Needless to say, this greatly reduces build-time.

Talking about how fast FASM can assemble itself isn't much of a guide here; while it's quite an amount of work for a single person, its amount of code is a drop in the ocean compared to GCC or the linux kernel. Also, the FASM source code afaik doesn't use much of the macro facilities; if you were to "reimplement linux in fasm using macros so it's not an über-impossible task", that would mean different compiling speed (and yes, I'm using the term "compiling" deliberately, since heavy use of macros isn't that far from HLLs).

Apart from the idea of rewriting linux in assembly being extremely silly and impossible to achieve, why do you suggest "putting all the source in a single file"? Ooooh, the maintainability horrors of that approach.

Tomasz Grysztar



Joined: 16 Jun 2003
Posts: 8359
Location: Kraków, Poland
Tomasz Grysztar 14 Oct 2010, 09:56
f0dder wrote:
why do you suggest "putting all the source in a single file"? Ooooh, the maintainability horrors of that approach.
I'm sure that Betov would disagree instantly (if he were still posting here) - he was promoting the idea of a single-file source for the exact opposite reason (better maintainability) in the old days. Well, there was something to it - it is a matter of the editing facility you use.
revolution
When all else fails, read the source


Joined: 24 Aug 2004
Posts: 20451
Location: In your JS exploiting you and your system
revolution 14 Oct 2010, 09:59
Modularised files in fasm allowed me to write fasmarm pretty easily. Without that I would have had a harder time with it.
ManOfSteel



Joined: 02 Feb 2005
Posts: 1154
ManOfSteel 14 Oct 2010, 10:25
f0dder wrote:
I wonder which project would take that long to compile on a reasonable machine?

Xorg, GNOME, KDE, OOo, etc. ... which is why I don't use DEs and only use binary packages for things like Xorg.
mindcooler



Joined: 01 Dec 2009
Posts: 423
Location: Västerås, Sweden
mindcooler 14 Oct 2010, 13:53
I compiled Openoffice.org on a Pentium-166. Took a week.
f0dder



Joined: 19 Feb 2004
Posts: 3175
Location: Denmark
f0dder 22 Oct 2010, 17:03
ManOfSteel wrote:
f0dder wrote:
I wonder which project would take that long to compile on a reasonable machine?
Xorg, GNOME, KDE, OOo, etc. ... which is why I don't use DEs and only use binary packages for things like Xorg.
Yeah, those are big, but in fairness they're not single applications but suites - still, I did write "projects", so point taken.

I still stand by my point that while you're doing dev, you don't "make clean" all the time; it's an iterative process - you only recompile modified parts, which should be only a very few files per build... unless, of course, your architecture and source file layout is a tangled spaghetti mess.

Tomasz Grysztar wrote:
f0dder wrote:
why do you suggest "putting all the source in a single file"? Ooooh, the maintainability horrors of that approach.
I'm sure that Betov would disagree instantly (if he were still posting here) - he was promoting the idea of a single-file source for the exact opposite reason (better maintainability) in the old days. Well, there was something to it - it is a matter of the editing facility you use.

Betov... has some pretty peculiar ideas. I partially agree with him, insofar as having a proper IDE with an "understanding" of your project can do a lot of really nice things that help with productivity and maintainability.

I don't agree with his idea of single-file projects, though. While I like IDEs, I also like having the ability to edit my source code without firing up a (possibly big and bulky) IDE. Splitting into multiple files means you can use your filesystem hierarchy for logical grouping, which is IMHO a very good thing. It also makes it easier to do code generation as part of your build process, to link to code produced in other languages, and to facilitate code re-use through libraries.

Of course Betov would argue that you should use DLLs if you want reusability, and that if you want to use existing libraries written in HORRIBLE HORRIBLE languages, you should disassemble their code and include the assembly dump... if that works for him, fine, but I like development tools that don't limit my choices.

By the way, while I think the C/C++ preprocessor is a powerful thing (and not something that should be boo-hissed at, when applied properly), I've never been a fan of the header/source split done in C languages... I came from Pascal, and its "unit" split and "uses" directive were among the few things I missed when moving to C and C++.

edfed



Joined: 20 Feb 2006
Posts: 4353
Location: Now
edfed 23 Oct 2010, 14:02
what i can say after one week of using linux is:

linux is real shit.

it is really bloated, really dumb, but as it is a religion now, nothing can be done about it.

as far as i can see, i cannot install gimp, blender, dosbox, or anything else without the internet and many hours of upgrades.

i cannot compile my code because of the uppercase/lowercase file name distinctions.

i cannot use linux to code.
it is not made for this.
and what feels really bad to me is the shell interpreter.

the command line under linux is like a big joke.

what i don't understand is:
why is linux so successful? is it really because it is open source? or just because some brainfucker made it when there was only MS-DOS?

i strongly believe that any OS project from this board is better than the first linux. and if efforts were concentrated on it as they are on linux dev, i am pretty sure it could kill linux (and windows). because linux is really a crap-heap of unbalanced libs, packages, etc... something i cannot bear anymore.

if i code only in asm, it is because i cannot bear bloat. and linux is exactly the opposite of asm.
revolution
When all else fails, read the source


Joined: 24 Aug 2004
Posts: 20451
Location: In your JS exploiting you and your system
revolution 23 Oct 2010, 14:14
edfed wrote:
why is linux so successful? is it really because it is open source?
I think it is because it is free and open source. Everybody likes to get stuff for free, of course. And since it is open source people can modify it if they have both the time and inclination.

edfed, what you seem to have experienced is the problem with having no central control. Linux is basically anarchy. There are a myriad of packages, perhaps one of them fits your needs perfectly. But finding that perfect one is a huge problem.
ManOfSteel



Joined: 02 Feb 2005
Posts: 1154
ManOfSteel 23 Oct 2010, 17:44
edfed wrote:
what i can say after one week of using linux is:

linux is real shit.

it is really bloated, really dumb, but as it is a religion now, nothing can be done about it.

Not that I am a big fan of GNU/Linux distros but aren't you exaggerating a bit? Maybe you tried a particular distro that has a bloated desktop environment and many huge default applications. There are lightweight minimalist distros, Arch Linux being one of them if I am not mistaken, as well as fairly complete but still lightweight distros. Check sleepsleep's posts for the latter.

edfed wrote:
as far as i can see, i cannot install gimp, blender, dosbox, or anything else without the internet and many hours of upgrades.

Where do you get your Windows software, eh? Software retailers?

edfed wrote:
i cannot compile my code because of the uppercase/lowercase file name distinctions.

How is case sensitivity impairing your ability to assemble code? Personally I have no problem with it so I don't understand.
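For the record, this is all the case distinction amounts to on a *nix filesystem - the file names below are made up for illustration:

```shell
# On a case-sensitive filesystem these are two distinct files.
mkdir -p /tmp/casedemo && cd /tmp/casedemo
echo lower > macros.inc
echo upper > MACROS.INC

ls               # lists both: MACROS.INC and macros.inc
cat macros.inc   # prints "lower"
cat MACROS.INC   # prints "upper"
```

So a source line like `include 'MACROS.INC'` fails when the file on disk is actually `macros.inc` - renaming the files (or the include statements) to a consistent case fixes it.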

edfed wrote:
i cannot use linux to code.
it is not made for this.

Oh, please! There are members of this very forum who code under GNU/Linux. The same tools are available (by default or not) on all *nix systems and I can tell you from personal experience they can be used efficiently for software development.
Heck, most open-source projects are developed on *nix systems and for them first and then ported to Windows.

edfed wrote:
and what feels really bad to me is the shell interpreter.

the command line under linux is like a big joke.

No, really, this is way too much. *nix shell scripting is far superior to any other. These systems can be controlled and managed entirely from a shell. As for the interactive part, most shells are way more flexible and powerful than Windows' command-line interpreter: permanent history, history editing, command/filename autocompletion, aliases, keybindings, etc.
If you dislike the terminal emulator, you can install another one. As for the shell itself, it may be possible to change under GNU/Linux, but I am not sure.
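As a small taste of the composition this enables - standard POSIX tools chained with pipes, no graphical tool required (the target directory is arbitrary):

```shell
# Report the three largest entries under /etc, biggest first:
# du lists sizes, sort orders them numerically, head keeps the top.
du -a /etc 2>/dev/null | sort -rn | head -n 3

# Interactive conveniences are configured once in the shell's rc file:
#   alias ll='ls -l'       # shorthand available in every session
#   history | tail -n 5    # recall recently typed commands
```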

edfed wrote:
why is linux so successful?

Does it? As far as I know, its popularity (and therefore success) in the desktop industry is quite insignificant.


~~~~


revolution wrote:
edfed, what you seem to have experienced is the problem with having no central control. Linux is basically anarchy. There are a myriad of packages

This is ridiculous and unfair, really. There are only a few GNU/Linux base distributions: Debian, Fedora, Gentoo, Mandriva, Red Hat, SUSE and Ubuntu. All distros are based on one base distribution or another. Distros are nothing more than the Linux kernel bundled with a group of default applications packaged and configured in a certain way by the people who created the particular distro.
Yes, there is a "myriad of packages" just like there are hundreds of thousands of applications on http://download.cnet.com/windows/ or similar websites. If Windows had a license with a free-distribution clause, virtually millions of Windows distros could be made using Windows and third-party applications.

revolution wrote:
perhaps one of them fits your needs perfectly. But finding that perfect one is a huge problem.

How is this different from Windows or any other system? Don't you still have to try all the alternatives and pick the one that "fits your needs perfectly"? The only difference is that people who use computers for the first time under Windows automatically have sets of applications shoved down their throats by the manufacturer, retailer, or computer-literate family members or friends. Ever wondered why IE still has almost half of the PC browser market share despite its infamous history as a slow and horribly buggy and insecure application?
Tyler



Joined: 19 Nov 2009
Posts: 1216
Location: NC, USA
Tyler 23 Oct 2010, 20:46
This may be my fault, partially. The way I told edfed to access the command line to execute the commands I gave him in PMs was through Alt+F2. edfed, there are many other ways. You can have a real terminal if you hit Ctrl+Alt+F1 (hit Ctrl+Alt+F7 to get back to the GUI).

Quote:

i cannot install gimp, blender, dosbox

I'm assuming you can't find them... The part about not being able to install without the internet is kind of pointless for two reasons: Windows is the same, and you're complaining about it on the internet.
Code:
sudo apt-get install gimp blender dosbox

It's that easy. Or even easier to search for them in Synaptic.

How can you say Linux is bloated compared to Windows? What version of Windows were you using, '98? It's true that Ubuntu is big, but its size buys usability, just as the size of current Windows is all there for a reason.
bitshifter



Joined: 04 Dec 2007
Posts: 796
Location: Massachusetts, USA
bitshifter 23 Oct 2010, 21:33
I have been studying the Linux 1.0 and Minix 3.1 sources for a while now.
Then I put Minix on a disk and booted it...
Coming from an MS-DOS background, I was unable to do anything with it.
It seems I need to read up on the shell's scripting language.
But from reading the sources I can say that Minix has some very good ideas on how an OS should be written.
There is a bit of bloat even in this tiny version, for portability's sake.
I only use Intel boards, and most likely will do so for the rest of my days.
(Or at least until someone forgets about backward compatibility and starts fresh.)
I strongly believe that an x86 kernel should be written in x86 ASM.
Go ahead and use C for the shell and everything else, though.
Also, AT&T (GCC) inline assembly really grosses me out.
Inline assembly in general is bad news; no two compilers are the same.
Just my thoughts...
Fanael



Joined: 03 Jul 2009
Posts: 168
Fanael 24 Oct 2010, 09:12
ManOfSteel wrote:
No, really, this is way too much. *nix shell scripting is far superior to any other.
No, really, this is way too much. Have you ever heard about PowerShell?

Copyright © 1999-2025, Tomasz Grysztar. Also on GitHub, YouTube.

Website powered by rwasa.