flat assembler
Message board for the users of flat assembler.

Index > Programming Language Design > Plain English Programming

Gerry Rzeppa



Joined: 10 Apr 2015
Posts: 37
Location: Franklin, KY
Gerry Rzeppa 20 Apr 2015, 20:59
nyrtzi wrote:
When someone speaks of programming in natural language all I hear is Cobol and SQL.

Hopefully, something like Plain English will one day pop into mind as well, having achieved equally wide reception in the marketplace. Smile

nyrtzi wrote:
I like the concepts behind relational databases but SQL although being useful is a poor implementation of those ideas and one of its faults is trying to look like natural language.

I agree that SQL is a badly conceived and implemented language. It also creates an "impedance mismatch" with most programmers, who naturally think at a row-at-a-time level (while SQL insists that they deal with entire sets of rows, all the time).

nyrtzi wrote:
Who was Cobol written for? For people who code or those who don't?

COBOL stands for "COmmon Business-Oriented Language" and was thus intended for those involved in businesses (both programmers and non-programmers); it was an attempt to get the user, the programmer, and the machine all speaking the same language.

nyrtzi wrote:
In the end it's all about communicating meaning, intent, etc.

Exactly. And when everyone speaks the same language, there is less possibility of miscommunication.

nyrtzi wrote:
As far as I'm concerned programming language might as well try becoming more math-like.

I think that's a mistake. Most of what most programs do is not mathematical in nature. Less than 2% of our IDE, for example, is mathematical (see my post on page 2 of this thread for the details). Most of it is statements like:

Copy the field into another field.
Append the fragment to the current routine's fragments.
Abort with "I was hoping for a definition but all I found was " then the token.
Initialize the compiler.
Remove any trailing backslashes from the path name.
Reduce the monikette's type to a type for utility use.
Eliminate duplicate nicknames from the type's fields.
Prepend "original " to the term's name.
Extend the name with the rider's token.
Unquote the other string.
Read the source file's path into the source file's buffer.
Generate the literal's name.
Extract a file name from the compiler's abort path.
Write the compiler's exe to the compiler's exe path.
Swap the monikettes with the other monikettes.
Skip any leading noise in the substring.
Scrub the utility index.
Fill the compiler's exe with the null byte given the compiler's exe size.
Position the rider's token on the rider's source.
Pluralize the type's plural name.
Link.
Finalize the compiler.
Check for invalid optional info on the type.


I really don't see how thoughts like those can be more easily or more clearly expressed in mathematical syntax.

nyrtzi wrote:
I'm not against natural language as far as easier readability goes but does programming in plain old english allow for better readability?

I think so, yes. When I get away from programming for a while, or when I'm looking at a routine I haven't seen in some time, I've found Plain English to be both the easiest to "get back into" and the easiest to "figure out".

nyrtzi wrote:
Programming languages aiming at being more like english usually seem to just end up being verbose. If the parser skips what it considers redundant filler words and concentrates on some subset of the language it considers meaningful then programmers will still have to learn what that subset is in order to understand and to be able to be sure about what they've written actually means and how it is understood by the mechanisms which implement what the text says.

In Plain English, most of the vocabulary and grammar for a given application is defined by the programmer himself as he creates new types, variables, and routines. So it's not so much a matter of the programmer learning the compiler's local dialect, but the compiler more-or-less automatically learning the programmer's dialect.

nyrtzi wrote:
Natural language is fuzzy and inexact unless you are prepared to be verbose enough to explicitly spell out the whole darn context in enough detail to make everything absolutely exact which is what the machine executing the program requires.

In practice, it's neither as difficult nor as verbose as you're imagining. Master mathematician Stephen Wolfram had the same kind of misgivings when he first looked into the matter, but he was quickly convinced otherwise (see http://blog.wolfram.com/2010/11/16/programming-with-natural-language-is-actually-going-to-work/ ).

nyrtzi wrote:
But if so and you need to learn the underlying rules of "simplified programming english" anyway then why not simply do what other languages do and make the programmer learn a simplified artificial language which doesn't bother trying to hide context?

First of all, because there's a big difference between an unfamiliar artificial syntax and a subset of everyday English. I don't really have to learn anything new to reduce the obscure "ensconce the blossoms in the receptacle" to "put the flowers in the vase." But I do need to learn something new to reduce that to "vase:=flowers" or "flowers -> vase" or "mov vase,[flowers]". Check out comparative examples of programming tasks in different languages ( http://rosettacode.org/wiki/Category:Programming_Tasks ) and see if you're not struck by (a) the arbitrary and contradictory nature of the examples, and (b) how those examples that are closest to English are the easiest to understand. Secondly, remember that in Plain English most of the vocabulary and grammar is defined by the programmer (as mentioned above).

nyrtzi wrote:
I just fail to see the benefits of programming in plain english.

Try it for a week or two and see how you feel then. Assuming you are adept in a variety of artificial programming languages, there is no doubt some "unlearning" to be done before you can give an unbiased review. After all, there's more truth than we like to admit in the old adage, "User friendly is what the user is used to."

nyrtzi wrote:
Why not chinese instead?

For native Chinese speakers, that's exactly the idea. We already have people working on Plain Spanish and Plain Portuguese, and we've got feelers out regarding Plain French and Plain Russian. Wouldn't we all like to talk to our computers in our own native tongues? Apple's SIRI, Microsoft's CORTANA, Wolfram's ALPHA, and Amazon's ECHO are all indications that we do. The difference between our approach and theirs is that we think the system should be "turtles" (ie, natural language) all the way down.
codestar



Joined: 25 Dec 2014
Posts: 254
codestar 20 Apr 2015, 21:33
Quote:
"Put, in the drawer, the socks" is not the way we normally speak English
I know. I'm just accustomed to seeing the destination (l-value) on the left - a=b - as in most programming languages. I wonder if there's another way of saying it:
Code:
set r1 to count
assign r1 to count
load a byte from p ; FASM1+G    
My macro languages are far from perfect; I've been working on them for years in my free time, and they work within the limitations of FASM1's capabilities. I only provide them for fun and educational purposes. I never finished the AND/OR conjunctions. That excerpt from SpriteFight (in my old Z language) is probably the most extreme example.

Overall, I think Plain English is good and would like to implement a similar language in FASMG when I get time=money.

PS: Example games written in Abakis:
Code:
; BINARY MASTER

WINDOW.W=360
WINDOW.H=492

include 'a.inc'

text t(256), title.t='Binary Master: %hh'

text help.t=+\
 'Fun Game for Programmers.' RET\
 'Click BITs. Count in binary.' RET\
 'Match the decimal number' RET\
 'in the red box to make rows' RET\
 'disappear. Press any key.' RET\
 'r=Reset. p=Pause. Esc=exit'

text pause.t=+\
 'Paused. Press p to continue' RET\
 'or r=Reset. Esc=exit'

text game.over.t=+\
 'Game over. Score: %hh.' RET\
 'Press any key'

align

integer scene, score
numeric SCENE.*, TITLE, PLAY,\
 PAUSE, GAME.OVER
numeric EASY=5000, NORMAL=4000, HARD=2000

BOX board, my.box
integer my.n, red.n, magic.n=10101101b
integer cbit.x, cbit.y, bits.h
numeric BIT.*, W=32, H=48

text my.numbers(8+4), red.numbers(8+4)

FONT main.font='font'

IMAGE bits.i='bits', bit1.i='1',\
 bit0.i='0', close.i='x'

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;

function random.byte
  locals n
  .r:
  random 3
  if r0<2
    random 16
  else.if r0=2
    random 128
  else.if r0=3
    random 255
  end
  . n=r0
  text.find red.numbers, n
  if true, go .r, end
  . r0=n
  if false, r0++, end
endf

function reset.game
  locals n, p
  . score=0, bits.h=1
  memory.zero my.numbers, 12
  memory.zero red.numbers, 12
  . n=8, p=red.numbers
  loop n
    random.byte
    . r1=p, *r1++=r0, p=r1
  endl
  set.box board, 4, 70, BIT.W*8, BIT.H*8
  . scene=SCENE.TITLE
endf

function on.create
  set.font main.font
  set.timer NORMAL
  reset.game
endf

function remove.byte, t, i
  locals n
  alias p=r0, q=r1, x=r2
  if i=7, go .new, end
  . p=t, p+i, q=p, q++, x=7, x-i, n=x
  loop n, *p++=*q++, endl
  .new:
  . p=my.numbers, *(p+7)=0
  random.byte
  . q=red.numbers, *(q+7)=r0
endf

function remove.row, i
  remove.byte my.numbers, i
  remove.byte red.numbers, i
  . bits.h--
  if bits.h<1, bits.h=1, end
endf

function check.numbers
  locals i
  . i=0
  while i<8, r0=my.numbers, r0+i
    . r1=*r0, r0=red.numbers
    . r0+i, r2=*r0
    if r1=r2, score+r1
      remove.row i
      return 1
    end
    . i++
  endw
endf 0

function draw.board
  locals i, n, x, y, w, h
  draw.image bits.i, 4, 35
  draw.image bits.i, 4, 457
   . x=0, y=0, w=32, h=48
  while y<8, x=0
    while x<8
      . r0=x, r0*w, r0+board.x
      . r1=y, r1*h, r1+board.y
      set.box my.box, r0, r1, w, h
      draw.box my.box, BLACK, GRAY
      . x++
    endw
    . r0=x, r0*w, r0+board.x
    . r1=y, r1*h, r1+board.y
    set.box my.box, r0, r1, 48, h
    draw.box.o my.box, WHITE
    . my.box.x+48
    draw.box.o my.box, RED
    . r0=y, r1=8, r1-bits.h
    if r0>=r1
      . r0=my.numbers, r1=y, r2=8
      . r2-bits.h, r1-r2, r0+r1
      . r1=*r0, my.n=r1
      . r0=red.numbers, r1=y, r2=8
      . r2-bits.h, r1-r2, r0+r1
      . r1=*r0, red.n=r1
      u2t my.n, t
      . my.box.x-40, my.box.y+11
      draw.text t, my.box.x, my.box.y
      . my.box.x+44
      u2t red.n, t
      draw.text t, my.box.x, my.box.y
    end
    . y++
  endw
endf

function draw.bit, n, x, y
  if n
    draw.image bit1.i, x, y
  else
    draw.image bit0.i, x, y
  end
endf

function draw.byte, n, x, y
  locals i
  . i=8
  loop i, r0=n, r1=i, r1--, r0>>cl, r0&1
    draw.bit r0, x, y
    . x+BIT.W
  endl
endf

function draw.my.numbers
  locals i, n, y
  . i=bits.h, y=404
  loop i, r0=my.numbers, r0+i, r0--
    . r0=*r0, n=r0
    draw.byte n, 4, y
    . y-BIT.H
  endl
endf

function draw.title.scene
  draw.text help.t, 16, 130
  draw.byte magic.n, 50, 300
endf

function draw.play.scene
  draw.board
  draw.my.numbers
endf

function draw.pause.scene
  draw.text pause.t, 16, 130
  draw.byte magic.n, 50, 300
endf

function draw.game.over
  print t, game.over.t, score
  draw.text t, 44, 170
  draw.byte magic.n, 50, 300
endf

function on.draw
  locals x, y, w, h
  clear.screen BLACK
  print t, title.t, score
  draw.text t, 4, 4
  draw.image close.i, 324, 4
  . r0=screen.w, r0--
  . r1=screen.h, r1--
  draw.outline 0, 0, r0, r1, GRAY
  if scene=SCENE.TITLE
    draw.title.scene
  else.if scene=SCENE.PLAY
    draw.play.scene
  else.if scene=SCENE.PAUSE
    draw.pause.scene
  else.if scene=SCENE.GAME.OVER
    draw.game.over
  end
endf

function on.key
  if key.event='c'
    if scene=SCENE.TITLE
      . scene=SCENE.PLAY
      go .draw
    end
    if scene=SCENE.GAME.OVER
      go .reset
    end
    if key='r'
      .reset:
      reset.game
      go .draw
    end
    if key='p'
      .pause:
      if scene=SCENE.PLAY
        . scene=SCENE.PAUSE
      else.if scene=SCENE.PAUSE
        . scene=SCENE.PLAY
      end
      go .draw
    end
    .draw:
    render
  end
endf

function on.mouse
  if.select board
    . r0=mouse.x, r0-WINDOW.X, r0-board.x
    . r1=BIT.W, r0/r1, cbit.x=r0
    . r0=mouse.y, r0-WINDOW.Y, r0-board.y
    . r1=BIT.H, r0/r1, cbit.y=r0
    if mouse.event='c'
      . r0=cbit.y, r1=8, r1-bits.h
      if r0>=r1, r0=my.numbers, r1=cbit.y
        . r2=8, r2-bits.h, r1-r2, r0+r1
        . r3=*r0, r2=1, r1=7, r1-cbit.x
        . r2<<cl, r3><r2, *r0=r3
      end
    end
  end
  if mouse.event='r'
    check.numbers
    go .draw
  end
  if mouse.event='c'
    . r0=&close.i.x
    if.select r0
      exit
    end
    if scene<>SCENE.PLAY
      reset.game
      . scene=SCENE.PLAY
    end
    .draw:
    render
  end
endf

function on.timer
  if scene<>SCENE.PLAY
    return
  end
  if mouse.1, return, end
  if bits.h<8, bits.h++
  else
    . scene=SCENE.GAME.OVER
  end
  render
endf

function on.exit
  ; ...
endf    
Gerry Rzeppa



Joined: 10 Apr 2015
Posts: 37
Location: Franklin, KY
Gerry Rzeppa 21 Apr 2015, 00:00
codestar wrote:
Overall, I think Plain English is good and would like to implement a similar language in FASMG when I get time=money.

The fact that assemblers exist is an admission that we find it inconvenient to code at the machine-code level. And the fact that macros and macro languages exist is an admission that we'd all rather be coding at a higher level still.

It's apparent that some people think that higher level should be mathematical in nature (think, for example, of the way sums and integrals are described in standard mathematical notation). And I'm sure there's a place for that kind of "higher level" language. (Remember, in the end, I'm a proponent of hybrid languages -- snippets of specialized syntax in a natural language framework, like a typical math book -- not just Plain English). But I believe that most of the things that most programs do can be most easily and most clearly described in everyday language: clear the screen, delete the row, move this from here to there, etc.

Judging from your examples above, it appears you're thinking along the same hybrid lines. For example, macro code like:

Code:
set.font main.font
set.timer NORMAL
reset.game
endf    

approaches the Plain English:

Set the font to the main font.
Set the timer to normal.
Reset the game.


while code like:

Code:
set.box board, 4, 70, BIT.W*8, BIT.H*8    

employs some "snippets" of mathematical notation (like BIT.W*8) within a generally higher-level statement.

An interesting facet of the whole question is the naming of variables. Mathematicians typically assign letter names to objects of interest ("...where w is the width of the box, and h is the height"). Note that in everyday speech we rarely give names to the objects around us, but refer to them with an article and a type description ("the chair" or "a table" or "some groceries"). Plain English, of course, makes use of this latter naming convention:

To enumerate from a number to another number:
Put the number into a count.
Loop.
Write the count on the console.
Add 1 to the count.
If the count is less than the other number, repeat.

where indefinite articles (a, an, another, some) indicate parameters ("a number" and "another number") and local variables (defined on the fly, like "a count"), while the definite article (the) indicates a global variable definition (not shown above) or a reference to a parameter or previously-defined variable (like "the count" and "the number" and "the other number").
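For readers more at home in C, a rough transliteration of that routine might look like this (an illustrative sketch, not output of the Plain English compiler; note that the repeat test comes after the body, so at least one value is always written):

Code:
#include <stdio.h>

/* Rough C transliteration of "To enumerate from a number to another number". */
void enumerate(int number, int other_number)
{
    int count = number;               /* Put the number into a count. */
    do {
        printf("%d\n", count);        /* Write the count on the console. */
        count = count + 1;            /* Add 1 to the count. */
    } while (count < other_number);   /* If the count is less than the other number, repeat. */
}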

Now Plain English currently includes hybrid code in the form of machine code "snippets", like this:

To put a byte into eax: Intel $8B9D080000000FB603.

Where the variable-naming issue doesn't arise (since "the byte" is referenced in the machine code via an offset from the stack pointer). But consider a hypothetical hybrid case like this, which is closer to some of your code:

To convert a fahrenheit temp to a celsius temp: c=(f-32)*5/9

Here the compiler (or macro language) needs to associate the math-syntax variables "f" and "c" with the natural language variables "fahrenheit temp" and "celsius temp," which in this case is possible since the leading characters are unique. If they weren't, we'd need some explanatory statements, like:

Let f be the fahrenheit temp.

Or perhaps,

...where f is the fahrenheit temp.

Of course we could compromise using standard Plain English "nicknames" for the variables (which drop the type, in this case "temp"):

To convert a fahrenheit temp to a celsius temp: celsius=(fahrenheit-32)*5/9

But it strikes me that at some point defining the relationships between the various synonymous names may end up being more effort than it's worth. After all, is it really so bad to have to say, once and only once when the function is being defined:

To convert a fahrenheit temp to a celsius temp:
Put the fahrenheit temp into the celsius temp.
Subtract 32 from the celsius temp.
Multiply the celsius temp by 5.
Divide the celsius temp by 9.


That, after all, is the procedure for the conversion. Or, more succinctly:

To convert a fahrenheit temp to a celsius temp:
Put the fahrenheit minus 32 times 5/9 into the celsius.

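For comparison, either form could plausibly be lowered to a small C function like this (a sketch, using integer arithmetic as in the step-by-step version above):

Code:
/* Hypothetical lowering of "To convert a fahrenheit temp to a celsius temp",
   following the step-by-step version: ((f - 32) * 5) / 9 in integer arithmetic. */
int convert_fahrenheit_to_celsius(int fahrenheit_temp)
{
    int celsius_temp = fahrenheit_temp;   /* Put the fahrenheit temp into the celsius temp. */
    celsius_temp -= 32;                   /* Subtract 32 from the celsius temp. */
    celsius_temp *= 5;                    /* Multiply the celsius temp by 5. */
    celsius_temp /= 9;                    /* Divide the celsius temp by 9. */
    return celsius_temp;
}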

Bottom line: At some point we decide to trade clarity (for all) in favor of brevity (which is understood only by some). Which is why we think some kind of hybrid will probably be the "language of choice" in the not-too-distant future. And why we think starting at the top is the thing to do: after all, it's much easier to add an inline assembler to Plain English than it is to add a Plain English framework to an existing assembler.
nyrtzi



Joined: 08 Jul 2006
Posts: 192
Location: Off the scale in the third direction
nyrtzi 22 Apr 2015, 06:31
Gerry Rzeppa wrote:
I agree that SQL is a badly conceived and implemented language


They should have stuck closer to relational theory for better results, but instead they went for optimization, and now we're stuck with the consequences.

Gerry Rzeppa wrote:
with most programmers who naturally think at a row-at-a-time level


"User friendly is what the user is used to"

I don't think there is anything inherently natural in the way mainstream programmers have been nurtured to think.
nyrtzi



Joined: 08 Jul 2006
Posts: 192
Location: Off the scale in the third direction
nyrtzi 22 Apr 2015, 07:04
Gerry Rzeppa wrote:
I think that's a mistake. Most of most programs are not mathematical in nature.


I should probably first point out that when I speak of math in the context of programming I'm talking of it in a sense broad enough to include logic as well. Programming is about building systems out of numbers in a systematic way even if the numbers have been hidden by abstractions. So perhaps talking about math and even choosing the word "math" was a mistake, as it often leads people to think of something other than what I meant. The keywords here at least for me are "systematic", "well-defined" and "unambiguous" and math is often seen as an example of something which has those properties.

Now if a compiler which turns plain natural language to machine code or some kind of intermediate code achieves those properties then I don't see a problem with it. If it doesn't then I won't be able to take it seriously as a suggested tool for programming.

The thing I first look for in a programming language is understanding so that when I type my code in I understand exactly what it means to the machine and what it is supposed to do. I don't mind if there is some black box abstraction involved to give things like portability but I need to be able to rely on the programming model provided by the language and libraries so that I don't need to do any guesswork as far as what my code means and does.

Even if most programs are not mathematical in the traditional sense they probably could benefit from the programmer having a more math-like mindset in which you strictly define everything so that if anything goes for example out of bounds you will know about it. Often I see people just writing that "the type of this variable is int" and paying zero attention to any concerns like if the actual value should only be allowed to be between 1 and 100.

I guess nothing prevents us from combining this more math-like way of thinking and structuring things with expressing the ideas in plain natural language.

Gerry Rzeppa wrote:
Wolfram had the same kind of misgivings when he first looked into the matter, but he was quickly convinced otherwise


I don't see him saying that the idea is without its share of difficulties. He suggests the programmer being shown the intermediate code synthesized from the natural language description as a way to make it clear exactly what the synthesized code does. This suggestion is pretty much what my demand for well-definedness and unambiguity is also all about. Programming in plain english seems like a nice idea in terms of allowing people to use a syntax they're more used to (I'm not going to call it natural because the word "natural" in itself is so problematic) but it really needs support from the tools to make sure that the ambiguity issue is properly solved. As long as that problem is solved and there is some sort of easy to read definition of the rules used by the parser so that I can learn the used dialect of the natural language in question I don't see any further big issues with plain english or plain chinese programming. After all Pascal uses a lot of plain english keywords; nothing prevents us from replacing the operators and other pseudo-mathematical parts of the syntax with something more accessible to people with less knowledge about artificial languages.

This whole idea kind of reminds me of Forth in terms of simplicity but then again Forth goes the other way as far as readability goes in terms of what people are used to from natural language.
Gerry Rzeppa



Joined: 10 Apr 2015
Posts: 37
Location: Franklin, KY
Gerry Rzeppa 22 Apr 2015, 07:30
nyrtzi wrote:
They should have stuck closer to relational theory for better results but instead went for optimization and now we're stuck with the results.

Agreed. Back in the day (when Oracle only had six employees!) I developed a database design course that they licensed from me ( www.era-sql.com ). At the time, the INGRES people had a much better language, and General Motors had a "relational algebra" level language that was much easier to understand than SQL's "calculus". You'd simply apply relational operations one at a time, producing named intermediate sets which served as input to further operations. Like:

X=select all employees in michigan.
Y=select all employees who are female.
Z=X intersect Y.
Show Z.


The best book on the subject that I've ever found, however, is Mikhail Gilula's concise "Set Model for Database and Information Systems" ( http://www.amazon.com/dp/0201593793 ) where he improves on both the model and the language. A little technical in the presentation, but the ideas are golden.

We later extended Gilula's work to include graphical elements in a "pagebase" that we've used for more than twenty years to manage our own internal data (instruction manual attached below). Our system treats wysiwyg pages as rows and folders as tables; everyone sees the same thing: the user sees pages, the programmer sees pages, the computer sees pages. Nothing but pages.


Attachment: PERSPECTIVE.pdf (540.01 KB, downloaded 1488 times)

Gerry Rzeppa



Joined: 10 Apr 2015
Posts: 37
Location: Franklin, KY
Gerry Rzeppa 22 Apr 2015, 08:16
nyrtzi wrote:
Programming is about building systems out of numbers in a systematic way even if the numbers have been hidden by abstractions.

I would have said that "programming is telling the computer what you want it to do." But perhaps one of us is simply looking at the problem from the bottom, while the other is looking from the top.

nyrtzi wrote:
The keywords here at least for me are "systematic", "well-defined" and "unambiguous" and math is often seen as an example of something which has those properties.

Three questions we had in mind when we started our Plain English project were:

1. Is it easier to program when you don’t have to translate your natural-language thoughts into an alternate syntax?

2. Can natural languages be parsed in a relatively “sloppy” manner (as humans apparently parse them) and still provide a stable enough environment for productive programming?

3. Can low-level programs (like compilers) be conveniently and efficiently written in high level languages (like English)?

The answer to all three questions, we found -- somewhat to our own surprise, and much to the surprise of others -- is yes.

But think about it a minute. Humans have been communicating in sloppy, ever-changing languages for millennia and it has worked out reasonably well. I say something to someone else, and if I get the desired result, I'm satisfied; if I don't get the correct response, I try something else. That's what programming in Plain English is like: it relies more heavily on testing than traditional programming, yet turns out (at least in our experience) to be every bit as efficient and significantly more convenient.

nyrtzi wrote:
Now if a compiler which turns plain natural language to machine code or some kind of intermediate code achieves those properties then I don't see a problem with it. If it doesn't then I won't be able to take it seriously as a suggested tool for programming.

We chose to write our entire system (interface, file manager, text editor, hex dumper, native-code-generating compiler/linker, and wysiwyg page layout facility for documentation) in Plain English to see if the idea was sound enough to endure the creation of a significant (rather than a mere toy) application. It worked. Two of us (working side by side on a single computer with two monitors, one mousing and one typing) coded the whole thing in six months: about 25,000 Plain English sentences that can re-compile themselves (in less than three seconds on a bottom-of-the-line computer) into a stand-alone executable (less than a megabyte in size).

nyrtzi wrote:
The thing I first look for in a programming language is understanding so that when I type my code in I understand exactly what it means to the machine and what it is supposed to do. I don't mind if there is some black box abstraction involved to give things like portability but I need to be able to rely on the programming model provided by the language and libraries so that I don't need to do any guesswork as far as what my code means and does.

The programmer who understands how our compiler works will generally be able to know how the machine will interpret his statements; the programmer who does not will have to fall back on the more human-like "did it do what I wanted it to do?" method described above.

nyrtzi wrote:
Even if most programs are not mathematical in the traditional sense they probably could benefit from the programmer having a more math-like mindset in which you strictly define everything so that if anything goes for example out of bounds you will know about it. Often I see people just writing that "the type of this variable is int" and paying zero attention to any concerns like if the actual value should only be allowed to be between 1 and 100.

I agree that diligence and attention to detail produce better programs; if that's what you mean by "a more math-like mindset" we can leave it there. But I've found that a "mathematical mindset" (in the traditional sense) can often be as much of a hindrance as a benefit. Like the old story about Henry Ford, who asked an engineer to calculate the volume of an oddly-shaped fuel tank: the engineer worked long and hard at his computations while Henry filled the thing with water and emptied it into a graduated cylinder. I think the best programmers use both sides of their brains.

nyrtzi wrote:
I guess nothing prevents us from combining this more math-like way of thinking and structuring things with expressing the ideas in plain natural language.

As I've mentioned elsewhere in this thread, it appears we all lean toward hybrid modes of communication: whether it's the genius who writes a technical paper filled with natural language, formulas, and graphics, or the man on the street who speaks in everyday language, "shop talk", and various gestures. Different strokes, not just for different folks, but for different kinds of problems.

nyrtzi wrote:
I don't see [Wolfram] saying that the idea is without its share of difficulties. He suggests the programmer being shown the intermediate code synthesized from the natural language description as a way to make it clear exactly what the synthesized code does.

Yes, but keep in mind that Wolfram, math master that he is, is strongly biased. I put up the link to his article to illustrate how even such a mathematically-biased individual can begin to see the world the way "the rest of us" do.

nyrtzi wrote:
This suggestion is pretty much what my demand for well-definedness and unambiguity is also all about. Programming in plain english seems like a nice idea in terms of allowing people to use a syntax they're more used to (I'm not going to call it natural because the word "natural" in itself is so problematic) but it really needs support from the tools to make sure that the ambiguity issue is properly solved. As long as that problem is solved and there is some sort of easy to read definition of the rules used by the parser so that I can learn the used dialect of the natural language in question I don't see any further big issues with plain english or plain chinese programming. After all Pascal uses a lot of plain english keywords; nothing prevents us from replacing the operators and other pseudo-mathematical parts of the syntax with something more accessible to people with less knowledge about artificial languages.

The final answers, I believe, will be at least these: some will study the parsing algorithms to learn how to write more certain and precise code; others will restrict themselves to simple and obvious sentences (as they would with small children); others yet will fall back on the testing method (did it do what I said?); still others will be comfortable only with confirmation in the form of a more precise (and more artificial) intermediate language; etc. I would expect that, over time, the compiler(s) will get better at discerning the user/programmer's intent as well.

nyrtzi wrote:
This whole idea kind of reminds me of Forth in terms of simplicity but then again Forth goes the other way as far as readability goes in terms of what people are used to from natural language.

You got that right. We were actually trying to develop a readable derivative of FORTH (called FIFTH, as you might expect) when we got into Plain English. We kept working on the FORTH syntax, trying to make it more readable, and one day said to ourselves, "Well, what exactly do we wish we could say there?" And Plain English was born!
nyrtzi



Joined: 08 Jul 2006
Posts: 192
Location: Off the scale in the third direction
nyrtzi 22 Apr 2015, 16:22
Gerry Rzeppa wrote:
The best book on the subject that I've ever found, however, is Mikhail Gilula's concise "Set Model for Database and Information Systems" ( http://www.amazon.com/dp/0201593793 ) where he improves on both the model and the language. A little technical in the presentation, but the ideas are golden.


Sounds like something I might want to add to my bookshelf. I already have some books from Darwen and Date but there should be nothing wrong with expanding my conceptual horizons.

Gerry Rzeppa wrote:
We later extended Gilula's work to include graphical elements in a "pagebase" that we've used for more than twenty years to manage our own internal data (instruction manual attached below). Our system treats wysiwyg pages as rows and folders as tables; everyone sees the same thing: the user sees pages, the programmer sees pages, the computer sees pages. Nothing but pages.


Sounds kind of similar to some of the stuff I've been playing around with in the context of wikis but then again all I have is toy examples.
nyrtzi



Joined: 08 Jul 2006
Posts: 192
Location: Off the scale in the third direction
nyrtzi 22 Apr 2015, 20:40
Gerry Rzeppa wrote:
But perhaps one of us is simply looking at the problem from the bottom, while the other is looking from the top.


Yep, seems so. In general I have no beef with what you've been saying. There are so many different ways of looking at and talking about the same things.

Having looked at and thought more about the subject it seems that the only issues left in my mind are: 1) how many rules do I need to remember in order to be able to predict the behavior of the compiler 2) what are the limits for modifying those rules.

How is the language best described? Procedural and imperative? Basically just a layer on top of machine language to keep the whole as simple as possible? This is what it has sounded like at least to me so far.

Gerry Rzeppa wrote:
I agree that diligence and attention to detail produce better programs; if that's what you mean by "a more math-like mindset" we can leave it there.


Yes, that's the main point: making sure that you can describe exactly how you want your program to behave, in a way that is as obvious as possible and not only easy to test yourself but easy for the compiler to check for you.

I used to be a big fan of languages which postpone almost all of their typechecking to runtime, but as the years have gone by I've become more and more convinced that while going dynamic is an easy shortcut and initially looks worth it, it is in practice nothing more than a way of sloppily cutting corners. And clearly written, well-named type definitions are invaluable as documentation too.

What is the state of support for user-defined types in the Plain English compiler? One thing I'm still waiting for mainstream languages to implement is opaque typedefs so that I can clearly say in the code that temperatures, distances and speeds are implemented as "floats" but that floats and these type aliases aren't globally interchangeable so that if I for example by mistake try to calculate the sum of a temperature and a distance the compiler will at least warn me that I'm trying to do something weird and inappropriate.

How do you see the relationship between programming in Plain English and Knuth's literate programming?

Gerry Rzeppa wrote:
The final answers, I believe, will be at least these: some will study the parsing algorithms to learn how to write more certain and precise code; others will restrict themselves to simple and obvious sentences (as they would with small children); others yet will fall back on the testing method (did it do what I said?); still others will be comfortable only with confirmation in the form of a more precise (and more artificial) intermediate language; etc. I would expect that, over time, the compiler(s) will get better at discerning the user/programmer's intent as well.


If everyone were to properly define their types (and the language would support them with the proper tools to do so) I'd suspect that it could help the compilers a long way in better understanding the intent of the programmer.

I'm still not sure what would be the best way to describe types and typing in general. On top of classes we now have mixins so that things can be more flexibly specified but for some reason I don't feel convinced just like I have my doubts about the mainstream object-orientedness too.

Gerry Rzeppa wrote:
You got that right. We were actually trying to develop a readable derivative of FORTH (called FIFTH, as you might expect) when we got into Plain English. We kept working on the FORTH syntax, trying to make it more readable, and one day said to ourselves, "Well, what exactly do we wish we could say there?" And Plain English was born!


Yes, sounds very familiar. Most programming language development nowadays seems to prefer to stick within the confines of the comfort zone of mainstream paradigms. Which is something I find very boring. Languages are just ripping off features from one another and big companies pick up languages from the academics and then after branding try to sell them as something completely new to the industry.

Why not do something fun and possibly innovative instead? Then again even if one manages to find something new and interesting it might take a while for it to get picked up by a wider audience. After all you need to gain enough momentum before you try to push through just like the heliocentric model had to do before it became mainstream. No point in trying to push too hard like Galileo did before the time was ripe. But who am I to go on about this. You probably know all of this way better than I do anyway.
Gerry Rzeppa



Joined: 10 Apr 2015
Posts: 37
Location: Franklin, KY
Gerry Rzeppa 22 Apr 2015, 22:24
nyrtzi wrote:
Having looked at and thought more about the subject it seems that the only issues left in my mind are: 1) how many rules do I need to remember in order to be able to predict the behavior of the compiler 2) what are the limits for modifying those rules.

A short summary of our language is as follows. Imagine the compiler itself speaking:

Quote:
HOW I WORK

Alrighty then. Here's how I manage to do so much with so little.

(1) I really only understand five kinds of sentences:
(a) type definitions, which always start with A, AN, or SOME;
(b) global variable definitions, which always start with THE;
(c) routine headers, which always start with TO;
(d) conditional statements, which always start with IF; and
(e) imperative statements, which start with anything else.

(2) I treat as a name anything after A, AN, ANOTHER, SOME, or THE, up to:
(a) any simple verb, like IS, ARE, CAN, or DO, or
(b) any conjunction, like AND or OR, or
(c) any preposition, like OVER, UNDER, AROUND, or THRU, or
(d) any literal, like 123 or "Hello, World!", or
(e) any punctuation mark.

(3) I consider almost all other words to be just words, except for:
(a) infix operators: PLUS, MINUS, TIMES, DIVIDED BY and THEN;
(b) special definition words: CALLED and EQUAL; and
(c) reserved imperatives: LOOP, BREAK, EXIT, REPEAT, and SAY.

So you can see that my power is rooted in my simplicity. I parse sentences pretty much the same way you do. I look for marker words — articles, verbs, conjunctions, prepositions — then work my way around them. No involved grammars, no ridiculously complicated parse trees, no obscure keywords.

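To make rule (1) concrete, here is a minimal sketch in C of how the first word of a sentence could decide which of the five kinds it is (an illustration only; the actual compiler is, of course, written in Plain English itself):

Code:
#include <strings.h>   /* strcasecmp (POSIX) */

typedef enum { TYPE_DEF, GLOBAL_DEF, ROUTINE_HEADER, CONDITIONAL, IMPERATIVE } sentence_kind;

/* Classify a sentence by its first word, per rule (1) above.
   The first word is assumed to be already isolated by a tokenizer. */
sentence_kind classify(const char *first_word)
{
    if (strcasecmp(first_word, "a") == 0 ||
        strcasecmp(first_word, "an") == 0 ||
        strcasecmp(first_word, "some") == 0) return TYPE_DEF;        /* A, AN, SOME */
    if (strcasecmp(first_word, "the") == 0)  return GLOBAL_DEF;      /* THE */
    if (strcasecmp(first_word, "to") == 0)   return ROUTINE_HEADER;  /* TO */
    if (strcasecmp(first_word, "if") == 0)   return CONDITIONAL;     /* IF */
    return IMPERATIVE;                                               /* anything else */
}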

The complete manual is here: www.osmosian.com/instructions.pdf

Keep in mind that the current system is a prototype, a "proof of concept". We were striving to see how far we could get with as little as possible.

nyrtzi wrote:
How is the language best described? Procedural and imperative? Basically just a layer on top of machine language to keep the whole as simple as possible? This is what it has sounded like at least to me so far.

Yes, procedural and imperative. Essentially just types (which can be extended), variables (local and global), and routines. A lot like FORTH in spirit, PASCAL in implementation.

nyrtzi wrote:
I used to be a big fan of languages which postpone almost all of their typechecking to runtime, but as the years have gone by I've become more and more convinced that while going dynamic is an easy shortcut and initially looks worth it, it is in practice nothing more than a way of sloppily cutting corners. And clearly written, well-named type definitions are invaluable as documentation too.

Teaching others to program, we've found that our tools and techniques change as the programmer develops; that learning to program is not a stroll up a smooth upward incline, but more like mounting a staircase, step by step. At the bottom of the staircase, we can do a lot with something like turtle graphics -- no need to deal with type checking and memory management issues, etc. But when that beginner moves up a step and wants to, say, manipulate one of his awesome turtle-graphic creations with the mouse, we don't just add more turtle-style code. We have to explain how the drawing needs to be saved in memory (and not just displayed on the screen), segment by segment, and how we might recognize a mouse click on any of those segments, and how we might put "handles" on selected segments so they can be moved and resized, etc. The higher step is thus not just more of the same, but something quite different. (A similar example would be static vs editable text.) In any case, we've found that as we proceed up the staircase, things that are relatively unimportant at the bottom (like type checking) become more and more useful -- even necessary.

nyrtzi wrote:
What is the state of support for user-defined types in the Plain English compiler?

Types in Plain English are pretty much at the level of PASCAL, sans arrays (we use linked lists for everything). But you can extend a type to retain partial compatibility with the underlying types. For example, you can say:

A box has a left, a top, a right, and a bottom.
A roundy box has a left, a top, a right, a bottom and a radius.


(We make the programmer repeat the common fields because we don't like having to "chase down" record definitions.) A "roundy box" is thus type-compatible with a "box" and when a routine (like "Find the center of a roundy box") can't be found, the routine designed for the underlying type ("Find the center of a box") will be called.
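A rough C analogy of that kind of extension (a sketch; the Plain English compiler does not literally work this way, but the layout idea is the same):

Code:
#include <stdio.h>

/* "A box has a left, a top, a right, and a bottom." */
typedef struct { int left, top, right, bottom; } box;

/* "A roundy box has a left, a top, a right, a bottom and a radius."
   The common fields are repeated, so a roundy box starts with the same layout as a box. */
typedef struct { int left, top, right, bottom, radius; } roundy_box;

/* "To find the center of a box..." */
void find_center_of_box(const box *b, int *x, int *y)
{
    *x = (b->left + b->right) / 2;
    *y = (b->top + b->bottom) / 2;
}

int main(void)
{
    roundy_box rb = { 0, 0, 10, 20, 3 };
    int x, y;
    /* No "find the center of a roundy box" exists, so fall back to the
       routine for the underlying type (here spelled as an explicit cast). */
    find_center_of_box((const box *)&rb, &x, &y);
    printf("center: %d,%d\n", x, y);
    return 0;
}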

nyrtzi wrote:
One thing I'm still waiting for mainstream languages to implement is opaque typedefs so that I can clearly say in the code that temperatures, distances and speeds are implemented as "floats" but that floats and these type aliases aren't globally interchangeable so that if I for example by mistake try to calculate the sum of a temperature and a distance the compiler will at least warn me that I'm trying to do something weird and inappropriate.

In Plain English some of that depends on the programmer. For example, we wanted pointers and numbers (which are always integers to us) to be distinct. Thus, instead of saying:

A number has 4 bytes.
A pointer is a number.


We instead said:

A number has 4 bytes.
A pointer has 4 bytes.


This makes pointer and number distinct types. We could (and did), however, make use of basic number arithmetic with pointers via our "employ" clause:

To add a pointer to another pointer: employ add a number to another number.
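In C terms, the effect is roughly what you get by wrapping each in its own struct instead of aliasing one to the other (again, a sketch, not the compiler's actual representation):

Code:
#include <stdint.h>

/* "A number has 4 bytes." / "A pointer has 4 bytes." -- two distinct 4-byte types,
   so accidentally mixing them is a compile-time error. */
typedef struct { int32_t  value; } number;
typedef struct { uint32_t value; } pointer;

number add_number_to_number(number a, number b)
{
    return (number){ a.value + b.value };
}

/* The "employ" clause borrows the number routine's code for pointers;
   in this sketch we simply repeat the equivalent arithmetic. */
pointer add_pointer_to_pointer(pointer a, pointer b)
{
    return (pointer){ a.value + b.value };
}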

nyrtzi wrote:
How do you see the relationship between programming in Plain English and Knuth's literate programming?

I sent a copy of our manual to Knuth this past January (he only does snail mail) and he replied with little handwritten notes on many of the pages. We were pleased to hear him say, "This is certainly an elegant proof of concept, and one of the nicest self-describing systems I've ever seen... The writing style is pleasant, and you successfully managed the haiku-like constraint of one topic per page so smoothly that I didn't really notice it until after reading your letter... Bravo!" He later expressed some reservations: "I always thought Grace Hopper's idea of programming in natural language is foolish/dangerous because English is so ambiguous and because we're not accustomed to disciplining ourselves when using it. However, you have successfully demonstrated that a natural subset of natural language actually exists." We unfortunately didn't discuss the relationship between our work and his "literate programming". Though our PERSPECTIVE system, mentioned in an earlier post, did make each routine just another element of a page, and those pages could include other elements (text and graphics) to illustrate, clarify, and otherwise comment on the code. So I think there's a lot of overlap. I'm sometimes tempted to take our Plain English code out of the text file realm and put that on wysiwyg pages like those supported by the Writer we used to document our system.

nyrtzi wrote:
If everyone were to properly define their types (and the language would support them with the proper tools to do so) I'd suspect that it could help the compilers a long way in better understanding the intent of the programmer.

Indeed. Reminds me of that old joke, "And God said to Adam, 'No wife for you until you get the names of all these animals right.' "

nyrtzi wrote:
I'm still not sure what would be the best way to describe types and typing in general. On top of classes we now have mixins so that things can be more flexibly specified but for some reason I don't feel convinced...

We've found the key to simplifying/clarifying types is to "lighten up" a little on the concept of reusable code; in other words, if we have to re-code an add routine (or something) to keep pointers separate from numbers, well, so be it. After all, every cell in our bodies carries a plethora of duplicate code. Another example would be the way we "force" the Plain English programmer to actually include the standard libraries he wants to use in his project (and not just reference them). While this admittedly means correcting mistakes in the libraries in more than one place (which is rare), it also means that a working program will continue working even when a standard library is "improved" (and possibly made incompatible or just plain broken). In short, we think it's preferable to copy a routine and tweak it a little for some special purpose rather than introduce all sorts of hierarchical type complexities and "callbacks" into our programs.

nyrtzi wrote:
...just like I have my doubts about the mainstream object-orientedness too.

We believe that the object-oriented paradigm is fatally flawed in fundamental ways. Hiding verbs (functions) inside nouns (objects) makes no linguistic sense at all. And such an approach forces the program designer to make choices he shouldn't have to make. Case in point: in Plain English we might very simply and naturally define two nouns (bicycle and tire) and a verb (mount) like so:

A bicycle is a thing with some tires...
A tire is a thing with some spokes...
To mount a tire on a bicycle...


But with the object-oriented approach, we'd need to make the mount verb either (a) part of the bicycle, or (b) part of the tire; and we thus end up with either a bicycle mounting its own tires, or a tire mounting itself on a bike -- both ridiculous images (though something we might expect from a biologist like Alan Kay who sets out to re-invent programming). It's good, we think (as with the redundant code I mentioned above), to try to imitate natural processes; but it's bad when we attempt to incorrectly extend animate paradigms to inanimate objects. The computer is not an autonomous being; it is a machine that does what it is told to do. Thus the procedural/imperative model is the correct one:

Mount the tire on the bicycle.

Where the active agent (the computer) is implied as in everyday English imperatives.
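A bare-bones C rendering of that procedural version might look like this (the field names are made up for the sketch):

Code:
#define MAX_TIRES 2

typedef struct { int spoke_count; } tire;
typedef struct { tire *tires[MAX_TIRES]; int tire_count; } bicycle;

/* "Mount the tire on the bicycle." -- the verb belongs to neither noun;
   the implied agent is the computer carrying out the instruction. */
void mount(tire *t, bicycle *b)
{
    if (b->tire_count < MAX_TIRES)
        b->tires[b->tire_count++] = t;
}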

nyrtzi wrote:
Yes, sounds very familiar. Most programming language development nowadays seems to prefer to stick within the confines of the comfort zone of mainstream paradigms. Which is something I find very boring. Languages are just ripping off features from one another and big companies pick up languages from the academics and then after branding try to sell them as something completely new to the industry.

Why not do something fun and possibly innovative instead? Then again even if one manages to find something new and interesting it might take a while for it to get picked up by a wider audience. After all you need to gain enough momentum before you try to push through just like the heliocentric model had to do before it became mainstream. No point in trying to push too hard like Galileo did before the time was ripe. But who am I to go on about this. You probably know all of this way better than I do anyway.

All input is appreciated. And you're right that most things, like telling good jokes, are a matter of timing. The trick is to keep an idea alive long enough so it's still breathing when the right time comes.
codestar



Joined: 25 Dec 2014
Posts: 254
codestar 24 Apr 2015, 14:49
Gerry:
Quote:
it appears you're thinking along the same hybrid lines
Yes, my first thought.

Your language has no names and mine has no types (all int=void* in 32BIT). A solution is to use all variants with .type, 8 or 16 bytes, depending on 32/64BIT. Example: JavaScript. Problems: Slow and wasteful for primitive types, but the time consuming functions - example: clear.screen = loop screen.n, (color) *vga++=c - can be predefined so that the compiler only produces a call. Writing it manually in the language would be too slow. Just one HD screen is 1920x1080 = 2,073,600 32BPP pixels = 8,294,400 bytes! In a game or animation, multiply by FPS.

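That predefined clear.screen amounts to a loop like this in C (a rough sketch; the framebuffer pointer and pixel count are hypothetical parameters):

Code:
#include <stddef.h>
#include <stdint.h>

/* One HD frame: 1920 x 1080 = 2,073,600 pixels; at 32 bpp that's 8,294,400
   bytes written per clear -- and in a game, once per frame. */
void clear_screen(uint32_t *framebuffer, size_t pixel_count, uint32_t color)
{
    for (size_t i = 0; i < pixel_count; i++)
        framebuffer[i] = color;
}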
Another way to reduce types is with variable style parameters:
Code:
// C...

unsigned int rgb(unsigned char r, unsigned char g, unsigned char b);
void set_box(box *b, int x, int y, int w, int h);

// True improvement of C (C++ sucks!)

uint rgb(byte r, g, b)
void set_box(box b, int x, y, w, h)    
Have you seen rosettacode? IMO, best reference for language creators. I like Euphoria.

No time, gotta make $ to survive. Wish I had time to work on Miracle CPU: 2-3 days = Android game.
Gerry Rzeppa



Joined: 10 Apr 2015
Posts: 37
Location: Franklin, KY
Gerry Rzeppa 24 Apr 2015, 17:31
codestar wrote:
...time consuming functions - example: clear.screen = loop screen.n, (color) *vga++=c - can be predefined so that the compiler only produces a call...

Sure. That's the whole idea behind "hybrid" programming: the right tool for each job.

codestar wrote:
Your language has no names and mine has no types...

The problem I have with typeless languages is that they force us to give names to things that we would normally identify by type (a box, the screen, some pixels). They also force us to qualify verbs unnaturally. For example, in Plain English we can say:

Draw a rectangle.
Draw a circle.


where "a rectangle" and "a circle" serve, very naturally, as both typed parameters and as part of the routine names. In an untyped language, we'd need to say something like:

draw_rectangle(r);
draw_circle(c);


where we not only have to qualify the function names, but lose the advantages of type-checking to boot.
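Plain C has no overloading, but C11's _Generic gives a rough feel for dispatching one verb on the parameter's type (an illustrative sketch with made-up shape types):

Code:
#include <stdio.h>

typedef struct { double width, height; } rectangle;
typedef struct { double radius; }        circle;

static void draw_rectangle(const rectangle *r) { printf("rectangle %g x %g\n", r->width, r->height); }
static void draw_circle(const circle *c)       { printf("circle, radius %g\n", c->radius); }

/* One "draw" verb, selected by the argument's type -- roughly what
   "Draw the rectangle." / "Draw the circle." resolve to in Plain English. */
#define draw(shape) _Generic((shape),   \
        rectangle *: draw_rectangle,    \
        circle *:    draw_circle)(shape)

int main(void)
{
    rectangle r = { 3, 2 };
    circle    c = { 1 };
    draw(&r);
    draw(&c);
    return 0;
}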

Incidentally, the object-oriented approach makes use of the type, which is good, but does so in such an unnatural way that it conjures up ridiculous pictures in one's head:

rectangle.draw();
circle.draw();


Are we really asking that rectangle to draw itself? Sounds like a language M. C. Escher would come up with!

codestar wrote:
Another way to reduce types is with variable style parameters:
Code:
uint rgb(byte r, g, b)
void set_box(box b, int x, y, w, h)    

I'm pretty sure Pascal has always supported that style of parameter.

codestar wrote:
Have you seen rosettacode? IMO, best reference for language creators.

Yes. Great site. But as I mentioned in another post, reading through Rosettacode brings two thoughts immediately to mind: (1) the overwhelming arbitrariness of most artificial syntaxes, and (2) how the examples that are easiest to understand are those that are most English-like.

codestar wrote:
I like Euphoria.

I agree that Euphoria has a lot of good ideas in it.
nyrtzi



Joined: 08 Jul 2006
Posts: 192
Location: Off the scale in the third direction
nyrtzi 26 Apr 2015, 04:55
Gerry Rzeppa wrote:
We believe that the object-oriented paradigm is fatally flawed in fundamental ways. Hiding verbs (functions) inside nouns (objects) makes no linguistic sense at all.


Yep, no linguistic sense, but it kind of makes sense inside the problem domain the first OO languages were designed and used for. To me the usual class-based OOP just looks like the same kind of compromise on the idea of OOP as SQL is on the idea of the relational database.

Then again what is OOP? There are multiple definitions around but none of them seems definitive.

Gerry Rzeppa wrote:
Thus the procedural/imperative model is the correct one


I find myself thinking along the same lines too. Having to simulate the code inside one's head is mentioned as a downside of imperative programming by Backus, but then again changing to something supposedly more declarative just seems to change the model you need to simulate in your head instead of eliminating the need to simulate.

I kind of understand why people want inheritance and dynamic dispatch. The constraints one is required to follow because of how those have been implemented are a pain, though, even if people seem unaware of the hoops they're having to jump through because of them. Then again, if they were completely unaware of the limitations they wouldn't have added multiple inheritance, mixins, monkey patching, etc. to their languages to get out of their straitjackets.

I don't know how much most ordinary programmers think of this though. I know a lot of programmers who don't seem to think about these things at all. Instead they just happily code along until they get into trouble with their overgrown single inheritance class hierarchies which are too deep and complex after which they wonder how they can reuse code from one class to another one in a different branch of the hierarchy without having to duplicate it and without rewriting and retesting all parent classes up to the shared parent class. And then when they are given the possibility of using multiple inheritance, mixins, etc they jump at it like it were a silver bullet without thinking about the kind of mess the use of those features can cause in the codebase if used without proper consideration. And this doesn't yet even cover the situation of people mixing in also all kinds of magic through language extensions, metaobject protocols, preprocessors, etc which turn the codebase as a whole to something which is next to impossible to understand and debug without an extensive set of tools.

Was it Hoare back decades ago who talked about his own experiences trying to run and manage a successful software project without being able to understand the program or the tools used, and who ended up saying that not understanding those things is ultimately the same as not knowing what you're doing? It's nice to be able to understand exactly what is going on in both the higher and lower layers, and to be able to go in and fix it if necessary.

To me it seems that types and procedures are easier to use and reuse when they aren't tied to each other. Perhaps one major motivation to keep procedures together and tied to a piece of shared data is to make information hiding easy, and to make it easy for the methods of the class to bypass it. Then again I think that this usual way of doing it is just too simplistic and that it could possibly be done better without the constraints of class-oriented ideology.

Which is kind of why I find myself to be more in the camp of multiple dispatch and procedure overloading combined with variants. This will most likely lead into the issue of the compiler and runtime having to pick the right procedure from the set of applicable ones and thus will require some sort of a language construct for the programmer to be able to specify the correct ordering between candidate procedures. Or there will have to be CLOS-style generic functions and the programmer will be required to pick the desired method combination style if not happy with the default one whatever it happens to be.

I've written a bit on this board earlier about the latest toy language I've been working on every now and then, but not many seem to be interested in something along the lines of a statically typed Lisp-derivative-to-C translator. It just tries to implement some of my favorite ideas in a single language which is nothing more than a thin layer on top of C, attempting to offer a more pleasant and modern experience along with the metaprogramming possibilities which come with homoiconicity.
Gerry Rzeppa



Joined: 10 Apr 2015
Posts: 37
Location: Franklin, KY
Gerry Rzeppa 26 Apr 2015, 06:59
nyrtzi wrote:
I find myself thinking along the same lines too. Having to simulate the code inside one's head is mentioned as a downside of imperative programming by Backus but then again by changing to something more supposedly declarative or whatever just seems to change the model you need to simulate in your head instead of eliminating the need to simulate.

Seems to me that people have been solving all kinds of problems for millennia in more-or-less procedural, imperative ways. Do this to that, then do something else, etc. It appears to be the way we normally and naturally think.

nyrtzi wrote:
I kind of understand why people want inheritance and dynamic dispatch. The constraints one is required to follow because of how those features have been implemented are a pain, though, even if people seem unaware of the hoops they have to jump through because of them. Then again, if they were completely unaware of the limitations they wouldn't have added multiple inheritance, mixins, monkey patching, etc. to their languages to get out of their straitjackets.

Seems to me that switching to an entirely different and unnatural paradigm just to avoid copying a few lines of code here, and whipping up a little dispatcher there, is a very high price to pay.

nyrtzi wrote:
I don't know how much most ordinary programmers think about this, though. I know a lot of programmers who don't seem to think about these things at all. Instead they just happily code along until they get into trouble with their overgrown single-inheritance class hierarchies, which are too deep and complex, after which they wonder how they can reuse code from one class in another class in a different branch of the hierarchy without having to duplicate it and without rewriting and retesting all the parent classes up to the shared parent class. And then, when they are given the possibility of using multiple inheritance, mixins, etc., they jump at it as if it were a silver bullet, without thinking about the kind of mess those features can cause in the codebase if used without proper consideration. And this doesn't even cover the situation where people also mix in all kinds of magic through language extensions, metaobject protocols, preprocessors, etc., which turn the codebase as a whole into something that is next to impossible to understand and debug without an extensive set of tools.

To paraphrase Joe's closing statement here ( http://harmful.cat-v.org/software/OO_programming/why_oo_sucks ): If a language technology is so bad that it spawns a whole new industry to solve problems of its own making then it must be a bad idea.

nyrtzi wrote:
Was it Hoare who, back decades ago, talked about his own experience of trying to run and manage a successful software project without being able to understand the program or the tools used, and who ended up saying that not understanding those things is ultimately the same as not knowing what you're doing? It's nice to be able to understand exactly what is going on in both the higher and lower layers, and to be able to go in and fix things if necessary.

That's exactly why we tried to code the whole shebang, top to bottom, using the most obvious syntax with just a handful of simple, procedural constructs. And, lo and behold, we were actually able to do so, conveniently and efficiently. Conclusion? All that other stuff must be nothing more than decoration (especially the tawdry stuff).

nyrtzi wrote:
To me it seems that types and procedures are easier to use and reuse when they aren't tied to each other.

Yes. Nouns are one thing, verbs are quite another; neither should be unnaturally forced to be "part of" or "inside of" or "subordinate to" the other.

nyrtzi wrote:
Perhaps one major motivation to keep procedures together and tied to a piece of shared data is to make information hiding easy while still letting the methods of the class bypass it. Then again, I think that this usual way of doing it is just too simplistic and that it could possibly be done better without the constraints of class-oriented ideology.

Information hiding is often more harmful than beneficial. Assuming it's always a good thing is part of where the unclear thinking begins.

nyrtzi wrote:
Which is kind of why I find myself to be more in the camp of multiple dispatch and procedure overloading combined with variants.

In Plain English any verb can be "overloaded" simply by following it up with different nouns, for example:

Draw a rectangle.
Draw a circle.
Draw a horse.


Dispatching to the appropriate routine (based on the parameter types) is automatic. Run-time dispatching based on data values, of course, requires hand-written dispatchers. I mentioned above how we included record "extensions" to handle variants, but looking back I think in most cases it was more trouble than it was worth.
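
For comparison, here is a rough conventional-language sketch of that distinction in Python (the shape types and routine names are invented, and this is not how our compiler works internally; functools.singledispatch only dispatches on the first argument's type, which is enough for the illustration). Type-based dispatch is resolved automatically, while dispatch on a run-time data value is a dispatcher you write yourself:

Code:
# Invented example: automatic dispatch on a parameter's type versus a
# hand-written dispatcher on a run-time data value.
from dataclasses import dataclass
from functools import singledispatch

@dataclass
class Rectangle:
    width: int
    height: int

@dataclass
class Circle:
    radius: int

@singledispatch
def draw(shape):
    raise TypeError("don't know how to draw " + repr(shape))

@draw.register
def _(shape: Rectangle):
    print("drawing a rectangle")

@draw.register
def _(shape: Circle):
    print("drawing a circle")

draw(Rectangle(3, 4))   # resolved automatically from the argument's type
draw(Circle(5))

# The hand-written dispatcher, switching on a run-time data value:
def draw_named(name: str):
    if name == "rectangle":
        draw(Rectangle(1, 1))
    elif name == "circle":
        draw(Circle(1))
    else:
        raise ValueError("unknown shape name: " + name)

draw_named("circle")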

nyrtzi wrote:
This will most likely lead to the issue of the compiler and runtime having to pick the right procedure from the set of applicable ones, and thus will require some sort of language construct for the programmer to specify the correct ordering among candidate procedures. Or there will have to be CLOS-style generic functions, and the programmer will be required to pick the desired method-combination style if not happy with the default one, whatever it happens to be.

I'm not sure more machinery (that inevitably begets still more machinery) is the answer. Perhaps we should all just bite the bullet and copy a little code here, whip up a little dispatcher there.

nyrtzi wrote:
I've written a bit on this board earlier about the latest toy language I've been working on every now and then, but not many seem to be interested in something along the lines of a statically typed Lisp-derivative-to-C translator that just tries to implement some of my favorite ideas in a single language: nothing more than a thin layer on top of C, attempting to offer a more pleasant and modern experience along with the metaprogramming possibilities that come with homoiconicity.

I tried to find a thread authored by you on the subject here but failed. Got a link?
nyrtzi



Joined: 08 Jul 2006
Posts: 192
Location: Off the scale in the third direction
nyrtzi 26 Apr 2015, 16:27
Sorry for responding so slowly. I'm a rather slow writer.

Gerry Rzeppa wrote:
I tried to find a thread authored by you on the subject here but failed. Got a link?


The only thread I was able to find was http://board.flatassembler.net/topic.php?t=15707 which covers something but is rather outdated. At the moment it's probably been a year since I last touched the codebase as I've been reading here and there while being stuck trying to decide which way to go with the design.

Better for me to just explain here what I've been thinking of, but I guess the main points were already mentioned: a reasonably simple Lisp syntax with a few added ideas to help with readability, a strong and static type system, namespaces, dynamic dispatch and variant types. Those are the main ingredients. I'm not thinking of something which would compete with scripting languages, but instead something I could use in place of C and C++ for writing non-trivial backend software with a codebase larger than what fits in my head. I want the language to help me avoid breaking things while maintaining code, and I dislike the approach of leaving that completely to a test suite; after all, errors should be caught as early as possible. In other words, I've tried all the other tools which have a significant amount of usable libraries available, and they've fallen short of my expectations, so I'm trying to cook up something which would work better for me.

A piece of example code I dug up from an old .txt file where I've tried to flesh out a specification:

Code:
(do {
    (:= x "abc") # assign the string "abc" into the variable 'x
    (var [str y]) # declare variable 'y with the type of 'str
    (if (regex:match x "/asdf/" y) { # match 'x against the regular expression and put the possible matches into 'y
      (var [str i])
      (for (i y) {
        (print i)
      })
    })
})
    


The standard Lisp syntax, using parentheses.

The curly braces are used as markers in the syntax so that source code formatting tools have enough information to do things right automatically, so that people can format the code any way they want while they're working on it, and it can be automatically turned into some standard format when it is committed to a repository.

The brackets are for specifying an expression's type. I'm still wondering if I want to infer variable types automatically or if they should be declared explicitly.

Gerry Rzeppa wrote:
Seems to me that switching to an entirely different and unnatural paradigm just to avoid copying a few lines of code here, and whipping up a little dispatcher there, is a very high price to pay.


I don't think anyone objects to copying a few trivial lines, but understandably they are less willing to keep around multiple copies of a complex segment of code into which they've put enough time and effort for it to feel significant, something which by itself needs testing and maintenance. In that kind of situation they should probably refactor their codebase to put the code in a place from which it can be easily used and where it can be separately maintained as a single copy.

The problem is that classes get their reuse done by introducing relationships, and thus dependencies, in a way which makes code less modular. Yes, modern tools can probably do the refactoring almost if not completely automatically, but the more tightly the method was coupled to the class, the more the programmer will need to check and test for breakage. However, not everyone knows enough about the subject to do this the easiest way.
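
A tiny, hypothetical Python illustration of the coupling I mean (invented classes, not anyone's real code): the same few lines of logic, once reachable only through one branch of a hierarchy and once as a free function that any type can call:

Code:
# Invented example: the helper is tied to SalesReport, so types in other
# branches of the hierarchy can only reuse it by dragging in a class they
# otherwise have nothing to do with...

class Report:
    def __init__(self, lines):
        self.lines = lines

class SalesReport(Report):
    def trimmed_lines(self):
        return [line.strip() for line in self.lines if line.strip()]

# ...whereas a free function carries no such relationship, so an unrelated
# type can reuse it without touching the hierarchy at all.
def trimmed_lines(lines):
    return [line.strip() for line in lines if line.strip()]

class AuditLog:
    def __init__(self, lines):
        self.lines = lines

    def cleaned(self):
        return trimmed_lines(self.lines)

print(AuditLog(["  a  ", "", "b"]).cleaned())   # -> ['a', 'b']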

Gerry Rzeppa wrote:
If a language technology is so bad that it spawns a whole new industry to solve problems of its own making then it must be a bad idea.


My view (this is a very rough sketch of the landscape which I just formed on the spot) on this is that there are the software artisans, the researchers and the industry workers. Software companies fall between workshops run by artisans and large industrial construction sites or factories. Chuck Moore is a classic example of an artisan and his tools like ColorForth and others speak of his needs and values. Then there are the hardcore academic researchers who write ML, Haskell and so on and emphasize type systems, formal proofs and math in everything they say and do. Last there are the software industry workers who try to reliably build software projects like engineers build bridges with a pre-planned budget and project plan with its deadlines and timeframes.

OOP is an industry product by the industry for the industry so that software workers can be safely put to work on the same large codebase without having to worry that they can poke things they shouldn't have access to. If you look at the industry programming languages it looks rather obvious why they've evolved like they have. Each iteration is more dumbed down and safer than the previous one so that no one working on the codebase can step on anyone's toes.

So I don't think that OOP spawned a new industry. It's just another tool in the software industry's toolbox. Hyped up and not as effective as advertised but still a step further into the direction big business wants the software industry to go to. The final goal of the industry is probably that of making all programmers easily replaceable components of a software production factory.

However it seems obvious why for the researcher and the artisan OOP doesn't look like a very attractive idea.

Gerry Rzeppa wrote:
Conclusion? All that other stuff must be nothing more than decoration


Looking at your achievement from my direction it seems like you made a great tool for doing the job you set out to do. That reminds me of Chuck Moore and what he did. It looks at the same time like an artisan's tool and another step into seeing if programming can be made more accessible to the general public through language. Impressive work but I don't think it meets the requirements of the industry nor of mainstream academic programming language research which seems to obsess about functional toy languages. What I'm trying to say is that I doubt Plain English would scale for example for a large modern industrial software project with sloppy "I just work here" kind of programmers messing around with the code.

I've been looking at the code of your compiler and so on but the language just doesn't feel like something I can instantly pick up and understand even after reading the instructions. Maybe I'm just not bright enough or something but I suspect that even in this what feels natural is a matter of what you're used to. It just feels like reading some assembly language dialect which has a syntax I'm not used to yet.

Gerry Rzeppa wrote:
Information hiding is often more harmful than beneficial. Assuming it's always a good thing is part of where the unclear thinking begins.


IIRC the Common Lisp Object System, for example, doesn't do hiding at all but works just fine. Either way, the mechanism I was thinking of for achieving information hiding when it is needed was that of creating opaque aliases of a record type, with the possibility of choosing separately for each field whether it is to be made hidden or read-only. After that it would just be a matter of passing the record instances through variables of the appropriate alias types to make sure that programmers don't corrupt or misuse fields by accident. With traditional OOP you get to define the properties of an object as private or const only once, and those definitions apply globally, which makes things less flexible than I'd prefer. I'd like to be able to fine-tune the constraints based on what is accessing the data.
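
Something like this rough Python sketch (invented names, and only an approximation of the alias-type idea) gives the effect I'm after:

Code:
# The same record handed to a caller through an "alias" view: one field
# readable but not writable, another hidden entirely. Invented example.

class Account:
    def __init__(self, owner, balance):
        self.owner = owner
        self.balance = balance

class BalanceView:
    """Alias type: 'balance' is readable but read-only, 'owner' is hidden."""
    def __init__(self, account):
        self._account = account

    @property
    def balance(self):
        return self._account.balance

def print_statement(view: BalanceView):
    print("balance:", view.balance)   # reading is fine
    # view.balance = 0                # writing would raise AttributeError
    # view.owner                      # and 'owner' is simply not exposed

account = Account("alice", 100)
print_statement(BalanceView(account))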

Gerry Rzeppa wrote:
Dispatching to the appropriate routine (based on the parameter types) is automatic. Run-time dispatching based on data values, of course, requires hand-written dispatchers


You say "types" so I'm assuming multiple dispatch. I'd have the compiler generate the code for runtime dispatching when such is necessary. I've been thinking that in my language variants would automatically support any operation supported by all of their members, which kind of limits the number of operations that need runtime dispatchers, as I'd assume that apart from a few common operations there wouldn't be much overlap in the operations supported. My aim is to have as little dynamic dispatching as possible, but I'd still like it to be something which just works automatically without me having to pay any attention to it, so that I can concentrate on the actual work at hand.
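
A rough Python sketch of that rule, with a mypy-style checker in mind (Png and Jpeg are invented stand-ins, not anything from an actual design):

Code:
# A variant may be asked to do something only if every member supports it;
# the call then dispatches at run time on whichever member is present.
from typing import Union

class Png:
    def byte_size(self):
        return 1024
    def palette(self):
        return ["#000000"]

class Jpeg:
    def byte_size(self):
        return 2048

Image = Union[Png, Jpeg]   # the variant

def report(img: Image):
    print(img.byte_size())    # fine: both members support byte_size
    # print(img.palette())    # a checker such as mypy rejects this,
                              # because Jpeg has no attribute "palette"

report(Png())
report(Jpeg())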

However, the dynamic dispatch is just a convenience; my actual goal is a little different. I want the compiler to scream bloody murder if I make any sort of error with the types, for example by trying to invoke an operation on a variant when it isn't supported by all of its members. In that case I'd have to use a typecase to split the variant's set of members into the ones which support the operation and those which don't, and invoke the operation only on the ones which do. And I specifically want to do this with a typecase control structure instead of a plain switch-case or if-else statement, because I want to make sure that I can't make a mistake by first testing the value's type and then by accident treating it as something else.
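
And the typecase itself, continuing the same invented Png/Jpeg variant, would look roughly like this, with Python's match syntax (3.10 or newer) standing in for a real typecase:

Code:
# Typecase sketch (self-contained, same invented Png/Jpeg variant): narrow
# the member type explicitly; a checker such as mypy narrows the value to
# the matched class inside each arm, so the "tested one type, used it as
# another" mistake gets flagged.
from typing import Union

class Png:
    def byte_size(self):
        return 1024
    def palette(self):
        return ["#000000"]

class Jpeg:
    def byte_size(self):
        return 2048

Image = Union[Png, Jpeg]

def describe(img: Image):
    match img:
        case Png():
            print("png with", len(img.palette()), "palette entries")
        case Jpeg():
            print("jpeg of", img.byte_size(), "bytes")

describe(Png())
describe(Jpeg())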

Yes, this is starting to sound more and more like an industry-oriented language to me too. Perhaps I'm just reacting to the languages I see in use in the industry and their issues. I don't see why I'd be the one getting the solution to the "make a better C" problem right though when so many have tried before me. I just want a language which doesn't feel too bloated, complex, sloppy and restricting.
Gerry Rzeppa



Joined: 10 Apr 2015
Posts: 37
Location: Franklin, KY
Gerry Rzeppa 26 Apr 2015, 18:44
nyrtzi wrote:
A piece of example code I dug up from an old .txt file where I've tried to flesh out a specification:

Code:
(do {
    (:= x "abc") # assign the string "abc" into the variable 'x
    (var [str y]) # declare variable 'y with the type of 'str
    (if (regex:match x "/asdf/" y) { # match 'x against the regular expression and put the possible matches into 'y
      (var [str i])
      (for (i y) {
        (print i)
      })
    })
})
    


The standard Lisp syntax, using parentheses.

The curly braces are used as markers in the syntax so that source code formatting tools have enough information to do things right automatically, so that people can format the code any way they want while they're working on it, and it can be automatically turned into some standard format when it is committed to a repository.

The brackets are for specifying an expression's type. I'm still wondering if I want to infer variable types automatically or if they should be declared explicitly.

It's clear you've put a lot of thought into this, but -- at the risk of sounding unnecessarily discouraging -- I don't see how the result is anything but just another unintuitive syntax that has to be learned before it can be used; just another syntax where the thoughts in our heads have to be translated before they can be written down.

I suspect the direction of approach ("I want to improve Lisp, or C, or whatever") is the problem; attacking the problem from the other direction ("I want to say what I'm thinking and have the computer understand it") seems, at least to me, to bear more edible fruit. As I mentioned in an earlier post, we were actually trying to improve FORTH when we came up with Plain English; like you, we kept trying to "invent" a simple, consistent yet flexible syntax -- we tried postfix, prefix, infix, all kinds of ideas -- when it dawned on us that just writing our thoughts down (rather than "translating" them) is what we really wanted to do. The fact that people tend to "automatically" fall back on pseudocode to address problems that are stumping them is proof, we think, that it's easier to think without the additional "translation" step. So let's just make the pseudocode understandable to the computer...

Case in point. You say:

(:= x "abc")

and describe the thought that's in your head with the comment:

# assign the string "abc" into the variable 'x

Why not simply say:

Put "abc" into a string.

And be done with it?

nyrtzi wrote:
My view (this is a very rough sketch of the landscape which I just formed on the spot) on this is that there are the software artisans, the researchers and the industry workers. Software companies fall between workshops run by artisans and large industrial construction sites or factories. Chuck Moore is a classic example of an artisan and his tools like ColorForth and others speak of his needs and values. Then there are the hardcore academic researchers who write ML, Haskell and so on and emphasize type systems, formal proofs and math in everything they say and do. Last there are the software industry workers who try to reliably build software projects like engineers build bridges with a pre-planned budget and project plan with its deadlines and timeframes.

That is a remarkably concise and (I believe) accurate summary of the situation. Well done!

nyrtzi wrote:
OOP is an industry product by the industry for the industry so that software workers can be safely put to work on the same large codebase without having to worry that they can poke things they shouldn't have access to... Hyped up and not as effective as advertised but still a step further into the direction big business wants the software industry to go to. The final goal of the industry is probably that of making all programmers easily replaceable components of a software production factory.

That may be the way it has worked out, but I don't think that was the intent of OOP when Alan Kay first started promoting the idea.

nyrtzi wrote:
If you look at the industry programming languages it looks rather obvious why they've evolved like they have. Each iteration is more dumbed down and safer than the previous one so that no one working on the codebase can step on anyone's toes.

If by "dumbed down" you mean "less capable", I agree. But today's languages are certainly not "dumbed down" in the sense that they are easy to learn and use.

nyrtzi wrote:
Looking at your achievement from my direction it seems like you made a great tool for doing the job you set out to do. That reminds me of Chuck Moore and what he did. It looks at the same time like an artisan's tool and another step into seeing if programming can be made more accessible to the general public through language.

Our goal is not only to reach the general public, though that's included; it's also to make things easier for the next generation of professionals who will be able to simply write down their thoughts like the author of a math book -- a natural language framework with snippets of specialized syntax where appropriate.

nyrtzi wrote:
Impressive work but I don't think it meets the requirements of the industry nor of mainstream academic programming language research which seems to obsess about functional toy languages.

Certainly not as it stands. And you're absolutely right that it currently falls into the "artisan" category and is thus not appropriate for industry and not of interest to the academics. But don't the real advances typically start out in the artisan community? -- people who not only research, but do? da Vinci comes to mind...

nyrtzi wrote:
What I'm trying to say is that I doubt Plain English would scale for example for a large modern industrial software project with sloppy "I just work here" kind of programmers messing around with the code.

Agreed. I think that's why our work is so often misunderstood. People assume that that is our target audience because of the simple and familiar syntax we use; but we're really looking for serious programmers who are still novice enough to think in "ordinary" and "natural" ways. In short, we're after the next generation of professionals. We'd like to step back, as it were, and see what computers would have looked like if they had been invented by philosophers rather than mathematicians.

nyrtzi wrote:
I've been looking at the code of your compiler and so on but the language just doesn't feel like something I can instantly pick up and understand even after reading the instructions.

That's the wrong way to approach it. The worst way to learn a language is to study it; the best (and natural) way is to simply be exposed to it (like a small child in the bosom of a normal family). That's why we recommend -- and I know it sounds tedious and childish -- that you print off the first 54 pages of our manual and actually type in the sample program. Plain English programming uses different parts of the brain than traditional programming; it's more like writing a post on this forum than writing code. But you won't discover that studying it; you have to actually try writing it, first by imitating the "parents" who know and understand the language, then by "writing extemporaneously" once -- and only once -- you've got the hang of it. It really feels different.

nyrtzi wrote:
Maybe I'm just not bright enough or something but I suspect that even in this what feels natural is a matter of what you're used to.

It's not a matter of intelligence; it's a matter of "unlearning" artificial ways of thinking. It's a matter of "getting back to one's roots".

nyrtzi wrote:
It just feels like reading some assembly language dialect which has a syntax I'm not used to yet.

Yet we know that's not the actual case. Surely the syntax of statements like those below is not unfamiliar to you:

Start up.
Initialize our stuff.
Handle any events.
Finalize our stuff.
Shut down.


You may not immediately understand, in technical terms, what those statements are doing or how they're doing it, but surely you understand the intent of those statements. I suspect it's your desire, as an accomplished programmer, to relate those statements to all the technical stuff you know about programming that makes it difficult to appreciate the simple beauty of the thing. It's as if a small child were trying to think about nouns and verbs and adjectives when first learning to speak. The problem isn't that you're "not bright enough" -- it's that you know too much.

nyrtzi wrote:
I just want a language which doesn't feel too bloated, complex, sloppy and restricting.

I really wish I could get you to actually learn Plain English -- print off the instructions and type in and run the sample code. After that, I suspect, you'd say something like, "Okay, I see what you're saying now. But what I really wanted to say on page 23 was..." and we'd be on to discussing the natural hybrid that is our ultimate goal.
nyrtzi



Joined: 08 Jul 2006
Posts: 192
Location: Off the scale in the third direction
nyrtzi 26 Apr 2015, 21:07
Gerry Rzeppa wrote:
It's clear you've put a lot of thought into this, but -- at the risk of sounding unnecessarily discouraging -- I don't see how the result is anything but just another unintuitive syntax that has to be learned before it can be used; just another syntax where the thoughts in our heads have to be translated before they can be written down.


Yes, that is true. And it possibly takes a long time until one has put so much time into it that it has become "second nature", which is of course something beginners can't do right off the bat and which requires investing time. However, there is a technical reason for picking this syntax despite the downside you mentioned: the usual benefit of homoiconic Lisp syntax, i.e., the metaprogramming capabilities you get in return, because the source code pretty much shows you directly what the parsed abstract syntax tree is and because the language is engineered to handle those data structures as one of its primary building blocks.

Now, considering the context of this whole discussion, this leads me to ask whether it could be possible to get this same kind of homoiconicity into Plain English to give it a boost in metaprogramming capabilities. How well does Plain English support the idea of interchangeability between code and data, in the sense that code could be treated as data and data as code using built-in data types? The Lisp community has been writing programs that write and manipulate programs for ages, the Go community has the gofix tool which updates old code to be compatible with current practices and the current language codebase, and the Python community also had some tool for automatically updating Python 2.x code to be mostly compatible with Python 3.x. Is this something Plain English could support too, and do it well? Just asking. I find this kind of stuff interesting.
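
Not homoiconicity proper, but a small Python illustration of the gofix/2to3 spirit, i.e., a program that parses another program, rewrites it, and prints it back out (the "deprecated" function name is invented; ast.unparse needs Python 3.9 or newer):

Code:
# Parse source into a syntax tree, rewrite an old API name, emit source again.
import ast

source = "print(old_clear_screen())"

class RenameCalls(ast.NodeTransformer):
    def visit_Name(self, node):
        if node.id == "old_clear_screen":   # invented "deprecated" name
            node.id = "clear_screen"
        return node

tree = RenameCalls().visit(ast.parse(source))
print(ast.unparse(tree))   # -> print(clear_screen())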

Gerry Rzeppa wrote:
That may be the way it has worked out, but I don't think that was the intent of OOP when Alan Kay first started promoting the idea.


Things rarely turn out the way people originally planned. An old internet saying has it that Kay once noted: "I invented the term object-oriented, and I can tell you that C++ wasn't what I had in mind."

Gerry Rzeppa wrote:
But today's languages are certainly not "dumbed down" in the sense that they are easy to learn and use.


I'll borrow an old quote here directly:

John Backus (Communications of the ACM August 1978) wrote:

Programming languages appear to be in trouble. Each successive language incorporates, with a little cleaning up, all the features of its predecessors plus a few more. Some languages have manuals exceeding 500 pages; others cram a complex description into shorter manuals by using dense formalisms. The Department of Defense has current plans for a committee-designed language standard that could require a manual as long as 1,000 pages. Each new language claims new and fashionable features, such as strong typing or structured control statements, but the plain fact is that few languages make programming sufficiently cheaper or more reliable to justify the cost of producing and learning to use them.


As a side note, I should mention that the two most recommended C++ books, "The C++ Programming Language" and "The C++ Standard Library: A Tutorial and Reference", are both over 1000 pages long, and combined over 2500 pages. Of course this doesn't directly prove anything, but it seems undeniable that C++ is a big and complex beast which takes a long time to master.

Gerry Rzeppa wrote:
Our goal is not only to reach the general public, though that's included


I don't know how it is out there in bigger countries, but here in the Nordic countries they already seem to be putting programming into the elementary school curriculum, so that little kids are taught systematic thinking through programming exercises straight after they've learned to write their own name, or at least that's what it has sounded like based on what's been said in the news. Perhaps these little kids could get an easier start into programming through a translated version of Plain English.

Gerry Rzeppa wrote:
But don't the real advances typically start out in the artisan community?


The industry usually doesn't seem to be interested in an idea until it has already been proven valuable and suitable for mass production, and academia suffers from the "not invented here" syndrome and looks down on everything without an academic pedigree. So yes, I'm somewhat inclined to agree. Revolutions in thinking usually don't start in circles dominated by normal science and industry best practices.

Gerry Rzeppa wrote:
That's the wrong way to approach it. ... It's not a matter of intelligence; it's a matter of "unlearning" artificial ways of thinking. It's a matter of "getting back to one's roots".


Unlearning is a hard thing to do. I've been practising it for most of my adult life and it still feels as difficult as it probably was in the beginning.
Gerry Rzeppa



Joined: 10 Apr 2015
Posts: 37
Location: Franklin, KY
Gerry Rzeppa 26 Apr 2015, 22:54
nyrtzi wrote:
Now, considering the context of this whole discussion, this leads me to ask whether it could be possible to get this same kind of homoiconicity into Plain English to give it a boost in metaprogramming capabilities.

It seems, again, that you're coming at the whole problem from the opposite direction. You're assuming that "homoiconicity" and "metaprogramming" are desirable things. But are they? Have you and I found such things necessary to communicate with one another on this forum? No. And we're doing the kind of communicating that the AI folks can only dream of! Yet we're doing it without most of the constructs those folks think desirable and even necessary.

nyrtzi wrote:
How well does Plain English support the idea of interchangeability between code and data, in the sense that code could be treated as data and data as code using built-in data types?

The programmer obviously thinks of his Plain English sentences as code; but our compiler, just as obviously, thinks of his sentences as data to be transformed into something more fundamental to the machine. But in general, at any particular level of abstraction, I think I would be against confusing the two. Nouns are one thing, verbs are quite another; the thing we're doing, and the thing(s) we're doing it to or with, should not ordinarily be confused. At the risk of sounding disrespectful, a similar wandering into the tangential by computer scientists can be seen in LISP's "fetish" regarding recursion (in place of ordinary and easy-to-understand loops).

nyrtzi wrote:
The Lisp community has been writing programs that write and manipulate programs for ages, the Go community has the gofix tool which updates old code to be compatible with current practices and the current language codebase, and the Python community also had some tool for automatically updating Python 2.x code to be mostly compatible with Python 3.x. Is this something Plain English could support too, and do it well? Just asking. I find this kind of stuff interesting.

Natural languages, we believe, are more immune to the "version" difficulty than artificial languages. I'm able to understand English sentences that were written centuries ago as well as sentences written today. Sure, there are some differences in vocabulary and grammar, but the "sloppy parsing" in my head seems to deal with those quite effectively. Ditto for various dialects of English today. It is expected that a mature Plain English compiler would do the same, in much the same way. For instance, definitions like these...

To Clear the screen... \ version 1
To Paint the screen all black: employ Clear the screen. \ kid's version
To Efface yon visual exhibition device: employ Clear the screen. \ 16th century pilgrim version
To Get all that sh*t off my m*f*ing screen: employ Clear the screen. \ rap version


...allow our compiler to properly "understand" and dispatch a wide variety of English dialects. And we would encourage Plain English programmers to develop even more such equivalents, so that any remotely reasonable way of expressing the thought could be processed. A brute-force approach, to be sure, but we've found that while a lot of brute-force doesn't necessarily result in artificial intelligence, it can, and often does, result in pretty impressive apparent intelligence.
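
For what it's worth, here is a crude conventional-language caricature of that brute-force table in Python (invented phrases and names; the real mechanism is the definitions shown above, not a lookup table):

Code:
# Many phrasings, one routine: a caricature of brute-force equivalents.

def clear_the_screen():
    print("screen cleared")

EQUIVALENTS = {
    "clear the screen": clear_the_screen,
    "paint the screen all black": clear_the_screen,
    "efface yon visual exhibition device": clear_the_screen,
}

def obey(sentence: str):
    routine = EQUIVALENTS.get(sentence.strip().rstrip(".").lower())
    if routine is None:
        raise ValueError("I was hoping for a known command but got: " + sentence)
    routine()

obey("Efface yon visual exhibition device.")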

nyrtzi wrote:
I'll borrow an old quote here directly:
John Backus (Communications of the ACM August 1978) wrote:

Programming languages appear to be in trouble. Each successive language incorporates, with a little cleaning up, all the features of its predecessors plus a few more. Some languages have manuals exceeding 500 pages; others cram a complex description into shorter manuals by using dense formalisms. The Department of Defense has current plans for a committee-designed language standard that could require a manual as long as 1,000 pages. Each new language claims new and fashionable features, such as strong typing or structured control statements, but the plain fact is that few languages make programming sufficiently cheaper or more reliable to justify the cost of producing and learning to use them.

Exactly. That's why we recommend, not a proliferation of features (which have to be learned and ought to be properly used), but rather a proliferation of naturally-occurring means whereby a handful of essential and stable features can be employed and invoked (as above).

nyrtzi wrote:
I don't know how it is out there in bigger countries, but here in the Nordic countries they already seem to be putting programming into the elementary school curriculum, so that little kids are taught systematic thinking through programming exercises straight after they've learned to write their own name, or at least that's what it has sounded like based on what's been said in the news.

My own personal experience has been that most kids prior to puberty are really not ready for programming. There are exceptions, of course.

nyrtzi wrote:
Perhaps these little kids could get an easier start into programming through a translated version of Plain English.

I think so, yes, obviously. But again, I would personally not try to force such things on younger children. Better that they get intimately familiar with tangible manipulatives before they're thrown into a universe with fewer constraints. I'm not against dreamers, of course, but I do feel that practical thinking should form the basis of creative thought.
nyrtzi



Joined: 08 Jul 2006
Posts: 192
Location: Off the scale in the third direction
nyrtzi 02 May 2015, 17:49
Stumbled upon something which syntactically kind of reminds me of Plain English.

http://en.wikipedia.org/wiki/JOSS

Especially the transcript of the terminal session just cracks me up.

http://upload.wikimedia.org/wikipedia/commons/1/10/JOSS_Session.jpg

Eh? SORRY. Eh? Laughing
Gerry Rzeppa



Joined: 10 Apr 2015
Posts: 37
Location: Franklin, KY
Gerry Rzeppa 02 May 2015, 21:26
nyrtzi wrote:
Stumbled upon something which syntactically kind of reminds me of Plain English.

http://en.wikipedia.org/wiki/JOSS

Especially the transcript of the terminal session just cracks me up.

http://upload.wikimedia.org/wikipedia/commons/1/10/JOSS_Session.jpg

Eh? SORRY. Eh? Laughing

Love it. I especially like the two-color ribbon idea; we often do a similar two-color thing with "console" programs now. The dream lives on...