flat assembler
Message board for the users of flat assembler.
Windows > Windows98 ring0 mode?
Tomasz Grysztar 29 Sep 2018, 20:15
Being in ring 0 and being able to call 16-bit BIOS interrupts are different things. You have no access to these interrupt functions from Win32 subsystem.
You can, however, program things like VGA and keyboard directly with IN/OUT instructions in Win9x; I recall doing this back in the day just for fun. Obviously this is destructive to system stability, though.
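For illustration, a minimal sketch of the kind of direct port access described above: a tiny Win32 program that pokes the keyboard controller ports to switch the keyboard LEDs on. It assumes, as stated above, that Win9x really lets a Ring 3 program execute IN/OUT; on NT-based Windows the very first port access raises a privileged-instruction exception. Untested, for illustration only:

Code:
format PE console 4.0
entry start

section '.text' code readable executable
start:
  .wait1:
        in      al, 64h         ; keyboard controller status port
        test    al, 2           ; bit 1 set = input buffer still busy
        jnz     .wait1
        mov     al, 0EDh        ; command EDh = set keyboard LEDs
        out     60h, al
  .wait2:
        in      al, 64h
        test    al, 2
        jnz     .wait2
        mov     al, 111b        ; turn Scroll Lock, Num Lock and Caps Lock LEDs on
        out     60h, al
        ret                     ; return to the loader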
DimonSoft 29 Sep 2018, 20:30
Ben321 wrote: I've heard that in Windows98 (unlike newer versions of Windows) all programs (not just the OS) run at the ring0 protection level, which is to say that all programs have direct access to hardware. That is, all programs running in Windows98 can use the IN, OUT, and INT opcodes, which in newer versions of Windows can only be done by drivers (everything else runs in ring3, which cannot directly access hardware).

This doesn't seem to be true. Even this article on TechNet suggests that

TechNet wrote: * All Ring 0 components reside in the address space above 3 GB.

which implies that Win32 applications are not Ring 0 components. In fact, it would be nonsense from a security point of view.

Ben321 wrote: So here's my question. Why is INT 0x10 not working. My code that I'm testing is the following:

You seem to be confused by some bad information sources, or by the mess in your head they have caused. My guess is that you're trying to write a 32-bit Windows application while your assembly programming book was written for MS-DOS. The book seems to be a bad one, because it hasn't taught you to consult the documentation when calling functions. If you did, you'd find out that there's no such thing as INT 10h in Windows programming. You'd also find that the 320x200 mode requires a different value in AX, and that only the AX part is used, not the whole EAX.

INT 10h is implemented by the legacy BIOS and is only available in real mode, which is not how Windows 98 runs: it runs in protected mode. So even if you could directly execute such an instruction, you would get a runtime error, because (a) the BIOS code behind this interrupt vector is 16-bit, and (b) the IDT would point to a completely unrelated handler for this vector (Math fault, I guess).

Quote: So my question is, what opcodes can I use to force my program into ring0 mode, so that I can then use more powerful opcodes like INT?

You shouldn't. That's not the way Win32 applications work. Just ask the OS to perform all the necessary stuff for you. It would be funny if any application could require switching to Ring 0, which is equivalent to being able to bring down the whole system.
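For contrast, here is roughly what the real-mode environment where INT 10h does work looks like: a 16-bit DOS .COM program that sets the 320x200 256-color mode with AX = 0013h, waits for a key, and restores text mode. A sketch only; nothing like it can be issued from a Win32 application.

Code:
org 100h
use16

        mov     ax, 13h         ; AH = 00h (set video mode), AL = 13h (320x200, 256 colors)
        int     10h
        xor     ah, ah          ; AH = 00h: wait for a key press
        int     16h
        mov     ax, 3           ; mode 03h: back to 80x25 text
        int     10h
        mov     ax, 4C00h       ; DOS: terminate with return code 0
        int     21h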
Tomasz Grysztar 29 Sep 2018, 20:36
DimonSoft wrote: In fact, it would be nonsense from a security point of view.

You could hang the entire system just by writing this into your Win32 program:

Code:
        cli
        jmp     $
revolution 30 Sep 2018, 01:18
ISTR remember that you can run 16-bit com programs in Win9x and all those lovely BIOS interrupts would work as expected.
Ben321 30 Sep 2018, 06:12
revolution wrote: ISTR remember that you can run 16-bit com programs in Win9x and all those lovely BIOS interrupts would work as expected.

What does ISTR mean?
revolution 30 Sep 2018, 06:16
I Seem To Remember
And now I see that I was redundantly redundant above by repeating "remember".
Ben321 30 Sep 2018, 06:20
DimonSoft wrote:
I cleared the whole EAX just in case calling an interrupt from within a 32-bit program behaved differently (using all 32 bits instead of only 16). Also, you are correct, that was a mistake in my code: I wasn't setting it to mode 13h. The code should have been:

Code:
        mov     eax, 0
        mov     al, 13h
        int     10h

Still, the result should have been a full glitching of the system as it tried to pump 800x600 16-bit images of the desktop into a video mode that was NOT 800x600, and in fact was a text mode (my previous mistaken code would have put it into a text mode, not a graphics mode).

As for the difference between 16-bit real mode and 32-bit protected mode interrupts: I assume the 32-bit protected mode interrupts are different for each OS, so there are ones unique to Windows98SE. Does anybody have a list of these for Windows98SE?

Also, it MUST run applications in ring0. If not, then the IN and OUT opcodes would not work; those can only be called from ring0 code. And to the best of my knowledge, applications can use the IN and OUT opcodes in Windows98SE.
revolution 30 Sep 2018, 06:23
Ben321 wrote:
Code:
        mov     eax, 13h
        int     10h
revolution 30 Sep 2018, 06:27
Ben321 wrote: As for the difference between 16-bit real mode and 32-bit protected mode interrupts: I assume the 32-bit protected mode interrupts are different for each OS, so there are ones unique to Windows98SE. Does anybody have a list of these for Windows98SE?
Ben321 30 Sep 2018, 07:09
revolution wrote:
When I use the SetPixel API function in Windows, the SetPixel function itself is in a DLL file. The internals of that DLL then communicate with the driver somehow (even though they are in separate rings: the DLL is in ring3 with the usermode program that loaded it, while the driver is in ring0 and thus is kernel mode), and the driver then uses either INT (interrupt) or OUT (I/O port output) instructions to send the pixel data to the actual hardware (the graphics card) in order to set the pixel on the screen to the desired color.

Can somebody here explain to me the actual interaction between the DLL and the driver? If a DLL (which consists of ring3 code) can communicate with a driver (which consists of ring0 code), then theoretically I can bypass the DLL altogether and communicate with the driver directly from my program (my EXE file's code). Does anybody here understand the actual mechanics of what's going on here, and how it can be accomplished in an assembly code EXE program?

I know it's undocumented and unique to each OS, which is why I'm focusing on just one OS, Windows 98SE (it is quite old, but still 32-bit, and is most likely to have the fewest safeguards in place that would otherwise prevent me from doing this). If there are any undocumented internals of Win98SE that would need to be discovered for me to do what I want, I hope they have already been uncovered by people reverse engineering Win98SE. If such internal undocumented mechanics of Win98SE have in fact been reverse engineered, I hope somebody on these forums has found the resulting unofficial documentation and can post a link to it here.

By the way, just what ring do EXE files run in under Win98SE? It can't be completely ring0, or there would be no need for drivers. It can't be completely ring3, or programs that directly call the IN and OUT opcodes wouldn't work (and I know there are plenty of hobbyist programs that use IN and OUT directly to control the parallel port, to send and receive signals to and from their hobbyist electronics projects).
Tomasz Grysztar 30 Sep 2018, 07:35
Ben321 wrote: Does anybody here understand the actual mechanics of what's going on here, and how it can be accomplished in an assembly code EXE program?

Code:
        int     20h                     ; CD 20
        dw      VMM_PageAllocate        ; 53 00
        dw      VMM_DEVICE_ID           ; 01 00

Last edited by Tomasz Grysztar on 30 Sep 2018, 07:44; edited 1 time in total
revolution 30 Sep 2018, 07:43
Ben321 wrote: By the way, just what ring do EXE files run in under Win98SE? It can't be completely ring0, or there would be no need for drivers. It can't be completely ring3, or programs that directly call the IN and OUT opcodes wouldn't work (and I know there are plenty of hobbyist programs that use IN and OUT directly to control the parallel port, to send and receive signals to and from their hobbyist electronics projects).
Tomasz Grysztar 30 Sep 2018, 08:05
revolution wrote: ISTR remember that you can run 16-bit com programs in Win9x and all those lovely BIOS interrupts would work as expected.

Moreover, even I/O was emulated in V86 mode: for example, you could reprogram the VGA directly on its ports and all of that would be emulated in the DOS console window. However, if you accessed the same ports from a Win32 application, you did reprogram the real thing (the actual graphics card).

EDIT: I have run some tests to verify that I remembered this correctly. It appears so.

Last edited by Tomasz Grysztar on 30 Sep 2018, 09:22; edited 3 times in total
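To make the two cases concrete, here is a rough, untested 16-bit .COM fragment that reprograms DAC entry 0 on the VGA ports. Run in a Win9x DOS window, these OUTs are trapped and emulated in V86 mode as described above; the same port writes issued from a Win32 program would go straight to the real card.

Code:
org 100h

        mov     dx, 3C8h        ; VGA DAC write index
        xor     al, al          ; select palette entry 0 (text-mode background)
        out     dx, al
        inc     dx              ; 3C9h = DAC data port
        mov     al, 63
        out     dx, al          ; red = 63
        xor     al, al
        out     dx, al          ; green = 0
        out     dx, al          ; blue = 0
        mov     ax, 4C00h
        int     21h             ; terminate via DOS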
Ben321 30 Sep 2018, 08:50
Tomasz Grysztar wrote:
Hmm. Interesting. Does the INT 20h instruction itself have certain requirements when calling it (such as certain values being stored in the EAX or EDX registers at the time that INT 20h is called)? And what's a VxD service?

Last edited by Ben321 on 30 Sep 2018, 08:52; edited 1 time in total
DimonSoft 30 Sep 2018, 08:51
Ben321 wrote: When I use the SetPixel API function in Windows, the SetPixel function itself is in a DLL file. The internals of that DLL then communicate with the driver somehow (even though they are in separate rings: the DLL is in ring3 with the usermode program that loaded it, while the driver is in ring0 and thus is kernel mode), and the driver then uses either INT (interrupt) or OUT (I/O port output) instructions to send the pixel data to the actual hardware (the graphics card) in order to set the pixel on the screen to the desired color.

The mechanisms behind this are well documented: you can switch between rings by using CPU-provided facilities. Call gates, interrupt gates, trap gates, task gates, add your favourite here as well… You can also execute a privileged instruction, which will cause ring 0 code to run (the OS handles the fault), and this seems to have been the fastest method for quite some time. Besides, there are the newer syscall/sysenter instructions. Not to mention that in the case of graphics, the video memory may be mapped into the application's address space (not sure whether Win98 did this, though), which eliminates the need to use IN and OUT when drawing. Applications in a multitasking environment are not expected to have direct access to hardware anyway.

What is not documented are the details of which actions performed in Ring 3 cause which consequences. Well, sort of undocumented, because some pieces of this information (those that are contractual) are in the DDK documentation for the OS. Details that might change in newer versions of the OS are, for good reasons, not documented.

You can, in theory, bypass a system DLL altogether by doing the same things it does. But that means copying code that might (and will!) change in the future into your program, tying it to a particular version of Windows, down to the exact set of updates and service packs installed. Doesn't seem to be a good idea.

There's a clear separation of concerns: applications ask drivers to perform actions on devices in application-level terms, and they live in Ring 3 perfectly well; drivers are responsible for translating these requests into actual operations on the hardware (as well as combining the requests made by different applications, if needed), and they might run in Ring 0. Removing the separation should have valid reasons. I doubt your task is such a case.
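As a sketch of that separation on Win9x specifically: the documented route from a Ring 3 program to a VxD is to open it through the \\.\ namespace and send it requests with DeviceIoControl. The VxD name MYVXD and the control code 1 below are made-up placeholders (a real VxD defines its own codes in its W32_DEVICEIOCONTROL handler), and the whole thing is an untested illustration rather than working code:

Code:
format PE GUI 4.0
entry start
include 'win32a.inc'

section '.data' data readable writeable
  vxdname   db '\\.\MYVXD',0            ; hypothetical VxD name
  result    dd ?
  returned  dd ?

section '.text' code readable executable
start:
        ; open (and, on Win9x, dynamically load) the VxD;
        ; 04000000h = FILE_FLAG_DELETE_ON_CLOSE, so it is unloaded on CloseHandle
        invoke  CreateFile, vxdname, 0, 0, 0, 0, 04000000h, 0
        cmp     eax, -1                 ; INVALID_HANDLE_VALUE?
        je      .done
        mov     ebx, eax                ; keep the VxD handle
        ; control code 1 is a placeholder for whatever the VxD implements
        invoke  DeviceIoControl, ebx, 1, 0, 0, result, 4, returned, 0
        invoke  CloseHandle, ebx
  .done:
        invoke  ExitProcess, 0

section '.idata' import data readable writeable
  library kernel32,'KERNEL32.DLL'
  import  kernel32,\
          CreateFile,'CreateFileA',\
          DeviceIoControl,'DeviceIoControl',\
          CloseHandle,'CloseHandle',\
          ExitProcess,'ExitProcess'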
Ben321 30 Sep 2018, 09:11
DimonSoft wrote:
I could assemble for DOS or V86 in Windows, using another assembler (FASM doesn't compile to 16-bit code), and then run it in real mode. But that has its problems.

The biggest one: you can't debug it (run each instruction step by step) in OllyDbg if it's a DOS EXE or COM file (which run in real 16-bit mode). OllyDbg is a REALLY great program for understanding what happens at each step of a program's operation, and as far as I'm concerned it is like a companion product to FASM. Assemble in FASM and then run the resulting EXE file in OllyDbg, one instruction at a time, to get a feel for what's happening internally in the hardware (memory, stack, CPU registers, etc.). It's a great combination of tools, FASM and OllyDbg. The problem is that OllyDbg doesn't work with 16-bit real mode programs like those that would run in DOS. And I will probably need 16-bit real mode for best usage of the INT, OUT, and IN instructions (if I can't figure out how to use them in a protected mode OS).

It's these 3 instructions that I'm trying to get a feel for, with the help of OllyDbg. The problem is that they will either require running in real mode (which means OllyDbg can't be used), or I will need to somehow trick Windows into running them in 32-bit protected mode. To run them in protected mode (especially the INT instruction, which is giving me problems here) I'm going to need to run them in ring0, which has the fewest restrictions. Therefore I'm going to need to figure out some hacky way of making my ring3 software switch into executing in ring0. That way, I can theoretically write all the INT instructions I want in my 32-bit code and not have Windows stop me and say "no no, you can't do that". If I can somehow switch my program into executing in ring0, I could theoretically accomplish what I want: run my EXE in OllyDbg and observe how these INT instructions actually behave, which would give me a much better understanding of how this stuff actually works.

The alternative would be to abandon my plan of running my application in 32-bit ring0 and instead use NASM (a different assembler, which supports 16-bit code) to assemble it as 16-bit code and run it in 16-bit real mode in DOS without the aid of a debugger. Of course that would not help me at all to understand what was actually happening as each opcode got executed, because it would be running without a debugger.

Now, I do have hope of forcing my program into ring0 in Win98SE. It's a very old OS, and I'm sure hackers by now have figured out an exploitable glitch that would allow you to make your code run in ring0, and thus have full ability to execute any opcode that the x86 CPU supports. I'm just hoping that a few of these hackers are here on the FASM forums and can impart their knowledge to me of how to force my program to run in ring0 in Win98SE.
Tomasz Grysztar 30 Sep 2018, 09:17
Ben321 wrote: I could assemble for DOS or V86 in Windows, using another assembler (FASM doesn't compile to 16-bit code)

Ben321 wrote: The biggest one: you can't debug it (run each instruction step by step) in OllyDbg if it's a DOS EXE or COM file (which run in real 16-bit mode).
fasmnewbie 30 Sep 2018, 10:24
@Ben321, you seem to be lost, and by miles. You're mixing things up. Maybe you should first understand things like CPU modes, privilege levels, etc. before approaching them through an assembler. Understanding the CPU is not the same as understanding assembly programming. For now, you should pause your assembly programming and invest some quiet time in reading the Intel/AMD manuals, particularly the chapters on the execution environment and CPU modes. They are a good read, often underestimated by newcomers.
Ben321 01 Oct 2018, 01:31
Tomasz Grysztar wrote:
The problem is that sometimes debuggers (at least the DOSBox debugger version) have trouble following the code's jump from 16-bit real mode to 32-bit protected mode and end up landing on the wrong opcode (allowing several opcodes to be executed without debugging, so I can't execute anything step by step until the debugger finally recognizes where it should be and starts the debugging process again, several opcodes past the landing spot of the jump to 32-bit protected mode). I did at one point use the DOSBox debugger, with a test program that jumped to 32-bit mode from 16-bit real mode, and it was terrible, so I figured this was fairly representative of most debuggers in DOS (they just tend to have trouble with the jump to 32-bit protected mode; it's part of their nature). So I have basically given up on DOS programming in assembly.

Also, is "FASM for DOS" a version of FASM that runs in DOS, or a version of FASM that runs in Windows and then compiles for DOS (which would then let you put your compiled program into a disk image via WinImage and load that disk image into something like VirtualBox to use it)?