flat assembler
Message board for the users of flat assembler.

Index > Heap > reverse the way we code software / application

sleepsleep
Joined: 05 Oct 2006
Posts: 9002
After I saw this Slashdot article,

http://developers.slashdot.org/story/13/07/24/1756254/ingy-dt-net-tells-how-acmeism-bridges-gaps-in-the-software-world-video

http://acmeism.org/projects/

http://en.wikipedia.org/wiki/YAML

then I thought: most of the time we confine ourselves to one language, producing our output through its limitations and features.

Even if we start from the final result we want, we still put a full stop at that language's border, following the syntax of a chosen language, architecture, and framework.

But what if we could pass through language, database, architecture, and framework? By reconstructing them differently (maybe designing a different wheel).
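The YAML link above is one concrete way to step over a language border: the same document is consumed unchanged from many languages. A minimal sketch, assuming Python with the PyYAML package installed (the data itself is made up for illustration):

Code:
# The same YAML text could be loaded, unchanged, from Perl, Ruby, JavaScript, etc.
import yaml  # PyYAML: pip install pyyaml

doc = """
app:
  name: demo
  targets: [x86, x86-64]
"""

config = yaml.safe_load(doc)        # parse into plain dicts and lists
print(config["app"]["targets"][0])  # -> x86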

It should be that working with code at the assembly level is the highest pleasure one could get.
Post 24 Jul 2013, 23:11
bitRAKE
Joined: 21 Jul 2003
Posts: 2940
Location: vpcmipstrm
These discussions are always interesting and entertaining!
Quote:
try {
    ...
} catch (roadrunner) {
}

Always seems to fail for some reason, though.
:lol: People are so afraid of using low-level code. zOMG, it will all go wrong - SECURITY!!!! Which is funny, because the result is still a complex system full of holes. Granted, a certain level of skill is required to find the holes - people can't just start angst-coding in the wild and let the social dynamics sort themselves out.

...or can they?

It's all academic, because some minimal level of defensive posturing always seems to be used - sometimes more for its political value than for the actual security it provides. Fear is worth points.


But I digress, again?

Isn't something lost every time we pass between abstraction layers? It doesn't matter whether we are talking about software, (natural) language, drawing, carving, music, DNA, or introspective cognitive analysis - each of these abstractions has an expressive power, one that often escapes definition. Yet we can probably all agree it exists. Just like you exist separately from everyone else - there is a reason for diversity.

...or is there?

Suppose we write a piece of software to formally analyze language specifications, and it finds that language B is a subset of language A. Why not just get rid of language B? How do we go about getting rid of these inferior languages?
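A minimal sketch of what such a check could look like, assuming both languages are regular and given as complete DFAs (real language specifications are far harder; even containment of context-free languages is undecidable in general): L(B) ⊆ L(A) holds iff no reachable state of the product automaton has B accepting while A rejects.

Code:
# Toy subset check for regular languages given as complete DFAs.
# A DFA here is (start_state, {(state, symbol): state}, accepting_states).
from collections import deque

def dfa_subset(b, a, alphabet):
    b_start, b_trans, b_accept = b
    a_start, a_trans, a_accept = a
    seen = {(b_start, a_start)}
    queue = deque(seen)
    while queue:
        qb, qa = queue.popleft()
        if qb in b_accept and qa not in a_accept:
            return False  # some word B accepts is rejected by A
        for sym in alphabet:
            nxt = (b_trans[(qb, sym)], a_trans[(qa, sym)])
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

# B accepts an even number of '1's; A accepts any string of '1's.
B = (0, {(0, '1'): 1, (1, '1'): 0}, {0})
A = (0, {(0, '1'): 0}, {0})
print(dfa_subset(B, A, ['1']))  # True
print(dfa_subset(A, B, ['1']))  # False: '1' is in L(A) but not L(B)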

Marginalization is the easiest method. It can be done systematically. First we redefine B in the culture - both with translators and with negative adjectives. Then we stop teaching B, or creating new tools for B. Eventually the only way to get to B is through other tools - virtually a second-class language at this point.

Hm...this is no different than the natural evolution of the software ecosystem. Therefore, all we need to do is create "better" languages and it will happen in time. Isn't diversity great!

_________________
¯\(°_o)/¯ unlicense.org
Post 25 Jul 2013, 06:10
dogman
Joined: 18 Jul 2013
Posts: 114
sleepsleep wrote:
After I saw this Slashdot article, I thought: most of the time we confine ourselves to one language, producing our output through its limitations and features.

Even if we start from the final result we want, we still put a full stop at that language's border, following the syntax of a chosen language, architecture, and framework.


This is why understanding the history of computing, and especially the history of language design, is important. It gives you a much better window to look through, so you can select your tools based on what's best for the job rather than choosing the mass-market popularity contest winner. When you can, that is; we mostly work for other people, so we can't always choose. But at least we will know right from wrong.

Everything has limitations. Part of the job of implementing things optimally is to select and apply the tools in the most harmonious way possible. This is often not happening in the business world, though. Still, it helps to know the difference.

sleepsleep wrote:
But what if we could pass through language, database, architecture, and framework? By reconstructing them differently (maybe designing a different wheel).


I think the time for being able to do that is long gone. Things are too big and complicated for any one person to understand anymore, and MBAs don't spend money on R&D, even at places where R&D is the business. They think they're smarter buying stuff that's already done, because R&D is hard and expensive. Removing the funding from innovation destroys the best stuff before it ever germinates. They're penny-wise and pound-foolish, more than they'll ever know.

If you start your own company you have a chance to make a better wheel or something that's even better than a wheel. If not, you start small and make your own building blocks and abstractions for the way you think the problem should be solved. There are always many constraints but you can often arrange things in a useful way. In some jobs you can do that more than others. But in most cases you can make your own foundation tools and build things the way you think they should be built.

sleepsleep wrote:
It should be that working with code at the assembly level is the highest pleasure one could get.


That's not effective for the most commonly needed class of IT apps, like GUIs and e-business stuff. There is still an ecosystem where virtually 100% of the code is written in assembler, but things are going very badly there, as in the rest of the economy and the tech world; I don't know how long it will last. Elsewhere I mostly don't see the case for assembly language except in a very few situations and specific kinds of jobs: new OS or driver design, some corners of the embedded world, and possibly compiler technology. But these are all very small niches, and most of those guys probably don't do everything in assembly.

For most people assembly coding is going to have to be a hobby, but it is essential to save it from becoming a lost art. Because on the day the last assembly coder goes away, the world is going to become a very different place. On that day the computers will have won, because there will be nobody left who really understands anything at all.
Post 25 Jul 2013, 07:23
dogman
Joined: 18 Jul 2013
Posts: 114
bitRAKE wrote:
These discussions are always interesting and entertaining!


It's great to find a forum where people care deeply about programming.

bitRAKE wrote:
Isn't something lost every time we pass between abstraction layers?


Yes! Complexity!

There is a difference between good abstraction and bad abstraction, and there is a difference in the motivations that produce them. The vast majority of abstractions in current OSes and software are about mitigating the fact that most coders are unqualified and incompetent, and about an incorrect view of how things should be designed. It really exposes the fact that most of what people use was never designed or engineered properly at all. Most people are now faced with countless layers of APIs and middleware that should never have been written in the first place. If the OS had been designed to be useful, directly useful, then all the abstractions between it and the programmer would not have been necessary.

OTOH, there are the abstractions a programmer makes to create domain-specific or platform-specific building blocks for the problem he wants to solve. Other people's abstractions are much less useful (and sometimes harmful) than ones you design yourself. People see things differently, so the OS and its services should be exposed to the developer, and the developer should cloak them from the end user. To look at another problem from the opposite point of view, consider how many applications improperly expose OS-centric or middleware-centric views. The end user doesn't care and should not have to understand how the OS or middleware or other layers work. He wants to do x, y, and z with a minimum of aggravation. Don't present information to him the way the OS wants to present it; present it the way he is interested in seeing it. This is one of the biggest problems in modern applications.

bitRAKE wrote:
Suppose we write a piece of software to formally analyze language specifications, and it finds that language B is a subset of language A. Why not just get rid of language B? How do we go about getting rid of these inferior languages?

Marginalization is the easiest method. It can be done systematically. First we redefine B in the culture - both with translators and with negative adjectives. Then we stop teaching B, or creating new tools for B. Eventually the only way to get to B is through other tools - virtually a second-class language at this point.

Hm...this is no different than the natural evolution of the software ecosystem. Therefore, all we need to do is create "better" languages and it will happen in time. Isn't diversity great!


I don't think it's that easy, because of what I wrote above. Aside from the social engineering you described, the problem remains that the current common OSes and middleware need to be replaced entirely. You can't make complex things simple. You can only strive to make something simple and keep it simple. Once it becomes complex, there is no going back.
Post 25 Jul 2013, 07:32
TmX
Joined: 02 Mar 2006
Posts: 822
Location: Jakarta, Indonesia
sleepsleep wrote:

It should be that working with code at the assembly level is the highest pleasure one could get.


I'm not really sure that assembly is a suitable tool for typical business or non-system-related apps.

What I'd like to have is some sort of formalized development system: once the software is built, you are confident that it runs correctly per the specification, and it can give you the mathematical reasoning for why your software is correct.

Is that too much? :D
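For what it's worth, this direction does exist in proof assistants. A minimal sketch in Lean 4, assuming a recent toolchain (the function and its specification are made up for illustration): the theorems are the specification, and the file only compiles if the proofs go through.

Code:
-- A toy "verified" function: the theorems below are its specification,
-- and Lean refuses to accept the file until the proofs are complete.
def mymax (a b : Nat) : Nat := if a ≤ b then b else a

theorem mymax_ge_left (a b : Nat) : a ≤ mymax a b := by
  unfold mymax; split <;> omega   -- case-split on the 'if', close with linear arithmetic

theorem mymax_ge_right (a b : Nat) : b ≤ mymax a b := by
  unfold mymax; split <;> omega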
Post 25 Jul 2013, 07:57
dogman
Joined: 18 Jul 2013
Posts: 114
TmX wrote:
What I'd like to have is some sort of formalized development system: once the software is built, you are confident that it runs correctly per the specification, and it can give you the mathematical reasoning for why your software is correct.

Is that too much? :D


Yes, it is. People have given up on formal proof-of-correctness for general use.

There are tools for safety-critical applications, but they're not widely used and not widely useful. Aside from the practical impossibility of proving correctness for most applications, the cost/benefit makes it a total no-go.

However...

Correct software engineering practices and certain technologies can address many problems before they become problems. Again, the expense is unacceptable in today's MBA climate, and it is certainly of questionable value in 99.9% of application systems. Where it is necessary (medical, air traffic control), people use stuff like Ada, where if the program even compiles, many possible sources of problems have been ruled out and it should do what you expect.

Everywhere else, nobody is willing to pull their heads out of the sand and realize that the cost of fixing broken systems far exceeds the cost of not designing and implementing broken systems in the first place. No MBA is going to sign off on up-front expenses to do things properly. They can always get a budget to fix stuff after it breaks, because a crisis is the best time to get a budget approved. When your system is down you'll pay any ransom to get it up again.

Correctness and good code are not a corporate priority, not even for software companies. I was around when they were, but those days are long gone. The only thing that matters to them now is what kind of deals they can sign and what the financials will look like next quarter. Then they just take a job at a new company and do it all over again.


Last edited by dogman on 25 Jul 2013, 11:31; edited 3 times in total
Post 25 Jul 2013, 08:22
bitRAKE
Joined: 21 Jul 2003
Posts: 2940
Location: vpcmipstrm
dogman wrote:
I don't think it's that easy, because of what I wrote above. Aside from the social engineering you described, the problem remains that the current common OSes and middleware need to be replaced entirely. You can't make complex things simple. You can only strive to make something simple and keep it simple. Once it becomes complex, there is no going back.
I agree with separating the user from the OS's internal structure. Design-wise, Apple took this top-down approach (maybe not far enough), but it didn't seem to have the effect of removing internal abstractions. I've always thought we'd layer on enough abstractions and then someone would come in and chop the middle out. Years ago I said MS should write a minimal OS for .NET, and that would be the new Windows. It hasn't really happened that way.

Simple doesn't stay simple. So, are you suggesting a periodic ethnic cleansing?

_________________
¯\(°_o)/¯ unlicense.org
Post 25 Jul 2013, 08:45
dogman
Joined: 18 Jul 2013
Posts: 114
bitRAKE wrote:
Simple doesn't stay simple. So, are you suggesting a periodic ethnic cleansing?


In the business world that's never going to happen, except perhaps in the sense that I expect appliances, and then appliance OSes, to become increasingly common as hardware gets cheaper and cheaper; all businesses want is an e-commerce appliance (cough, webserver with a PHP/MySQL stack, cough), a BI server, etc. There's no reason to drag around the weight of a whole OS to do that kind of stuff. In that way I think things will become simplified, but that leaves the general-purpose OS on shaky ground, and there hasn't been anything revolutionary in OS development in the last 30 years. There are still a few that are probably good enough, but they're too expensive for common use and not "modern" enough to make the transition to appliance OS.

The other thing is research. That's where a lot of the ideas came from that eventually turned into useful implementations. But since the money for research is mostly gone, that channel is closed. It's hard to know what to do or where to turn. Writing an OS is a lot different from engineering an OS. There are only two engineered OSes left, and they're both in bad shape for the future. The rest seems to be Linux everywhere, and that isn't good either.

I have no answers, but I have some pretty good ideas about what the problems are. I guess that's better than nothing. The bottom line is always that business has to need something badly enough to fund it. That's where the rubber meets the road. If nobody will pay for something, it doesn't matter how good it is; and if somebody will pay for something, it doesn't matter how bad it is. People never seem to understand that throwing money at broken stuff is just putting on band-aid after band-aid. It may address the symptoms, but it never fixes the root cause.

_________________
Sources? Ahahaha! We don't need no stinkin' sources!
Post 25 Jul 2013, 09:37
bitRAKE
Joined: 21 Jul 2003
Posts: 2940
Location: vpcmipstrm
People want large state spaces even on their lightweight clients, so it's a bit of a farce to tout the appliance OS. The multi-button purchasing-machine paradigm is quite limited - that's a very shallow hole. We do see the market fragmenting, and I can see the appliance OS from that perspective.

General purpose doesn't go away, though. It'll wane as people move to their appliances. The cloud doesn't replace the home command center, any more than people want to give up their cars for AI vehicles. It's too costly to produce an infrastructure of comfort sufficient for the culture to make the migration any time soon.

The bane of marketing is our enemy - corporations have poisoned their own well by dumbing down consumers and believing their own hype to drive sales. Combine that with a general lack of trust in corporations. Hasn't there also been a process of moving research in-house at many companies?

The language drives the changes in culture. Superior tools will win or be emulated. Slowing down the technological cycle benefits corporations, imho. But they can't halt it.
Post 25 Jul 2013, 10:39
dogman
Joined: 18 Jul 2013
Posts: 114
It's happening already. The number of desktops drops dramatically every year. People want entertainment appliances at home or in their hand, and webserver, e-commerce, and database appliances at work. Anything that eliminates the cost of people and management is going to succeed in the market in the long term. The inhibitor now is that engineered appliances cost too much for small and medium businesses. That will change, and the general-purpose OS will slowly disappear. There is absolutely no reason to run a full Linux on a phone or tablet when all you need to do is make calls, watch movies, and play games. Linux is free and portable, so it's widely deployed. Android proves a mostly Java-based OS can serve the phone and entertainment appliance sector, and further modifications will continue slimming down non-essential stuff and increasing performance for the tasks people are actually willing to pay for.

"Hasn't their also been a process of moving research in-house at many companies?"

As far as I can tell, the answer is a resounding "No!" People are outsourcing more than ever. Anything that doesn't smell like a manager is not something they want to deal with. The MBA's idealized corporation is an org chart of nothing but managers who buy services from a smorgasbord. Nobody wants to spend a penny if they can avoid it, even to make money. They can't think past the end of this quarter. Do you know how many companies pump up the financials with waves of hiring and firing? IBM just fired 10,000 people. A few years ago they reached the point, for the first time in their history, of having more people working outside America than in it. This is the biggest corporation in America, and most of its employees aren't Americans and don't live in America. What does that tell you about the economy and the tech world? Almost every major company is laying off people by the thousands. Apple doesn't actually make anything; it's all subcontracted to Foxconn, which has a million Chinese guys on the payroll (barely). Nobody wants to actually work for their money any more, or pay for research. They just keep eating up smaller companies and startups instead of doing it themselves. People are cattle to them. They don't even care what kind of spots they have.

"Superior tools will win or be emulated"

This has never been true except in a tiny number of cases; it was certainly never true in general. The very few superior tools that have seen the light of day were never emulated, because they were too expensive and too well-designed to be cloned by sweatshops full of idiots. Most of the best stuff didn't win and will not sell, especially not in this generation, where new and cheap equal good and old or expensive equal bad. That's the only context anymore. When ethics and morality cease, the distinction between good and bad becomes irrelevant, and cheap takes the place of good.
Post 25 Jul 2013, 11:21
Copyright © 1999-2020, Tomasz Grysztar.