flat assembler
Message board for the users of flat assembler.
CPU Shader Framework
bitRAKE 12 Mar 2026, 20:30
Who can afford a new video card? So, play with shaders on your CPU.
Win32 CPU-shader host for experimenting with image shaders, ray-tracing studies, interactive widgets, text rendering, and scene-reduction POCs.

https://github.com/bitRAKE/C-CPUShader

It isn't meant to be particularly good at anything, but it can do many things. Perhaps I'll better align the c-shader language with other shader language(s) - probably not.

There are a number of proof-of-concepts (POCs):
- shader logic reused as a standalone tool window
- low-tech fixed-grid SDF text
- more advanced MTSDF text rendering
- Blender-driven scene reconstruction

* This is a tool for programmers - you'll probably need to recompile.

_________________
¯\(°_o)/¯ AI may [not] have aided with the above reply.
bitRAKE 14 Mar 2026, 00:23
The start of this work was to produce HDR UI through shaders. You might think HDR isn't a big deal, but I can see the difference now, and so can you if your monitor supports HDR.

The presentation backend was completely re-engineered because HDR support sucks outside of DX. So, although backends exist for DX12, OpenGL, and Vulkan, they all route through DXGI to produce consistent results. Shader features are expanded: they select the color space and the backend tries to present/capture correctly.

Capture frames to PNG - yes, HDR PNG from a shader, with alpha.

New sprite capture POC! Yes, write a shader, get a sprite.

* Don't trust any performance metrics yet - kind of on the back-burner.
sylware 16 Mar 2026, 11:58
Is HDR just 16 bits per color?
bitRAKE 16 Mar 2026, 17:04
scRGB (CCCS): DXGI_FORMAT_R16G16B16A16_FLOAT, linear
HDR10 (ST.2084): DXGI_FORMAT_R10G10B10A2_UNORM, non-linear

... are both considered HDR on the presentation side. HDR10 is weird and requires complex color space conversion, but it's closer to what monitors actually support. I still need to run more color space tests to ensure the round-trip works in the different formats.

1.0f is equal to 80 nits in scRGB. So, although it's linear, monitor range matters in color conversion (if we assume component ranges are [0.0, 1.0]).

The obvious question to me is: to what extent can conversion be avoided? Can shaders produce the expected color form?

_________________
¯\(°_o)/¯ AI may [not] have aided with the above reply.
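To make the conversion cost concrete, here is a minimal sketch (my own, not from the project) of encoding one linear scRGB component into an HDR10 (ST.2084 PQ) signal value, assuming scRGB's 1.0f == 80 nits and PQ's 10000-nit reference range. It shows only the transfer function; a real scRGB-to-HDR10 path also needs the BT.709-to-BT.2020 gamut conversion.

Code:
/* Sketch only (not from the project): encode one linear scRGB component
   into an HDR10 (ST.2084 PQ) signal value. Assumes scRGB 1.0f == 80 nits
   and PQ's 10000-nit reference; gamut conversion (709 -> 2020) omitted. */
#include <math.h>

float scrgb_to_pq(float c)
{
    const float m1 = 2610.0f / 16384.0f;          /* SMPTE ST.2084 constants */
    const float m2 = 2523.0f / 4096.0f * 128.0f;
    const float c1 = 3424.0f / 4096.0f;
    const float c2 = 2413.0f / 4096.0f * 32.0f;
    const float c3 = 2392.0f / 4096.0f * 32.0f;

    float y  = c * 80.0f / 10000.0f;  /* scRGB units -> fraction of 10000 nits */
    float ym = powf(y, m1);
    return powf((c1 + c2 * ym) / (1.0f + c3 * ym), m2);
}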
sylware 16 Mar 2026, 18:56
Then I guess ARGB16161616 is HDR?
bitRAKE 17 Mar 2026, 13:27
At the application layer.
_________________
¯\(°_o)/¯ AI may [not] have aided with the above reply.
sylware 18 Mar 2026, 12:10
Well, in my WIP wayland compositor, the framebuffer actually scanned out is native ARGB16161616 (no color banding). It seems many apps are currently limited to ARGB8888 (I have to blit/convert on the fly).
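A minimal sketch of that on-the-fly conversion (my assumption of the approach, not sylware's code): each 8-bit channel is widened by byte replication, which maps 0x00 to 0x0000 and 0xFF to 0xFFFF exactly (equivalent to multiplying by 257).

Code:
#include <stddef.h>
#include <stdint.h>

static inline uint16_t expand8to16(uint8_t v)
{
    return (uint16_t)((v << 8) | v);    /* replicate byte: v * 257 */
}

void blit_8888_to_16161616(const uint32_t *src, uint64_t *dst, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        uint32_t p = src[i];
        dst[i] = ((uint64_t)expand8to16((uint8_t)(p >> 24)) << 48)   /* A */
               | ((uint64_t)expand8to16((uint8_t)(p >> 16)) << 32)   /* R */
               | ((uint64_t)expand8to16((uint8_t)(p >>  8)) << 16)   /* G */
               |  (uint64_t)expand8to16((uint8_t) p        );        /* B */
    }
}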
bitRAKE 18 Mar 2026, 19:29
No monitor actually displays ARGB16161616, though.
For me, the better goal is trying to display in a manner that matches the expressiveness of the monitor. The CPU shaders prefer 4 floats because of SIMD.

_________________
¯\(°_o)/¯ AI may [not] have aided with the above reply.
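To illustrate the SIMD point (an illustrative sketch, not the project's actual pixel type): one SSE register holds a whole RGBA pixel as 4 floats, so per-pixel math becomes a single vector operation.

Code:
#include <immintrin.h>

/* One RGBA pixel in one SSE register: a tint multiply is one instruction. */
__m128 modulate(__m128 rgba, __m128 tint)
{
    return _mm_mul_ps(rgba, tint);
}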
sylware 18 Mar 2026, 23:34
Mine is a 10-year-old iiyama 1080p monitor; in 16-bit color I do not get color banding (I did visual tests).
Furs 19 Mar 2026, 21:30
sylware wrote: Mine is a 10-year-old iiyama 1080p monitor; in 16-bit color I do not get color banding (I did visual tests).
HSE 19 Mar 2026, 22:32
There is just a problem running GDI.
sylware 20 Mar 2026, 13:12
Furs wrote:

Then 16-bit color encoding is even more appropriate, namely in the worst-case scenario it will be done by the monitor hardware. I could not see any color banding or dithering artifacts in my visual tests with full-screen 16-bit color gradients, "my nose touching the panel" (10-year-old panel).
bitRAKE 21 Mar 2026, 02:37
I need to research this deeper to verify my current understanding: some cheap monitors don't have a lot of processing hardware and lean on the video card more for color processing. Where in this "extra" processing can software intercept?
bitRAKE 23 Mar 2026, 12:13
* Note: I forgot to push prior changes - so, this is a double update, lol.
General AVX2 alpha release binaries and plugin SDK. I recommend building locally for your processor.

Major shader collection refactor into plugins. Only three shaders are baked into the executable; the plugins folder is searched for DLLs. Any number of directories can be added to the plugin search process. The plugin SDK covers collections and shader support. POCs now also serve as example plugin collections of shaders. (A sketch of the discovery step follows below.)

Major build refactor with multi-layer, directory-based NMake rules to simplify changes.

*New* shader collection of monitor diagnostic shaders. Tune your monitor or just learn what the technologies look like. Is it really HDR?

More variable shaders showcasing scriptable output. Nostalgic HDR shader that pays homage to the classic PNG dice test image. Transparent dodecahedron added to the animation collection - so pretty.

Removed truncating or clamping of HDR - it makes for bad presentation or saving. HDR output is restricted to the EXR format. Includes a simple EXR viewer.

Currently working on a more rigid shader policy to eliminate confusion, corner cases, and subtle misuse.
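A hypothetical sketch of that DLL discovery step (the names scan_plugins and GetShaderCollection are my assumptions, not the actual plugin SDK):

Code:
#include <windows.h>
#include <stdio.h>

typedef void (*get_collection_fn)(void);    /* assumed export signature */

void scan_plugins(const char *dir)
{
    char pattern[MAX_PATH], path[MAX_PATH];
    WIN32_FIND_DATAA fd;

    snprintf(pattern, sizeof pattern, "%s\\*.dll", dir);
    HANDLE h = FindFirstFileA(pattern, &fd);
    if (h == INVALID_HANDLE_VALUE) return;
    do {
        snprintf(path, sizeof path, "%s\\%s", dir, fd.cFileName);
        HMODULE mod = LoadLibraryA(path);
        if (!mod) continue;
        get_collection_fn reg = (get_collection_fn)
            GetProcAddress(mod, "GetShaderCollection");  /* assumed name */
        if (reg) reg();           /* let the plugin register its shaders */
        else FreeLibrary(mod);    /* not a shader collection - unload */
    } while (FindNextFileA(h, &fd));
    FindClose(h);
}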
sylware 23 Mar 2026, 12:38
bitRAKE wrote: I need to research this deeper to verify my current understanding: some cheap monitors don't have a lot of processing hardware and lean on the video card more for color processing. Where in this "extra" processing can software intercept?

Namely, the right way would be to be native 16-bit color; then some software could include an 'HDR low spec' option, aka software dithering (it is complicated and expensive). You can have a look at the famous Silksong game, where you have 2 levels of dithering... and even with that, you still have color banding. Yep, I play a few games on elf(glibc)/linux, which explains why I am very sensitive to color banding. (BTW, the Steam client is HELL to run, because most Valve linux devs are not technically shining... which is very weird.)
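For reference, a minimal sketch of what such software dithering can look like (my illustration, not Silksong's implementation): a 4x4 ordered (Bayer) dither adds a sub-LSB offset before quantizing a [0,1] channel down to 8 bits, trading banding for fine noise.

Code:
#include <stdint.h>

static const float bayer4[4][4] = {        /* thresholds in [0,1) */
    { 0/16.f,  8/16.f,  2/16.f, 10/16.f},
    {12/16.f,  4/16.f, 14/16.f,  6/16.f},
    { 3/16.f, 11/16.f,  1/16.f,  9/16.f},
    {15/16.f,  7/16.f, 13/16.f,  5/16.f},
};

uint8_t dither8(float v, int x, int y)     /* v in [0,1] */
{
    float q = v + (bayer4[y & 3][x & 3] - 0.5f) / 255.0f;  /* sub-LSB push */
    if (q < 0.0f) q = 0.0f;
    if (q > 1.0f) q = 1.0f;
    return (uint8_t)(q * 255.0f + 0.5f);
}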
bitRAKE 23 Mar 2026, 16:25
https://www.anyhere.com/gward/hdrenc/hdr_encodings.html
https://en.wikipedia.org/wiki/RGB_color_spaces
https://en.wikipedia.org/wiki/ScRGB

If you're talking about 16-bit floats, then okay - that's what scRGB is using, to stay compatible with legacy sRGB and remain linear. All my attempts to use integer 16-bit formats resulted in errors in transmission. What is the color space expected by GIMP/PNG/TIFF? Why does the white balance always blow out the image? I'm assuming my shaders use too wide a gamut to be represented by integers - that's why I'm not supporting anything outside of EXR for HDR until some post-processing features are settled. Presently, the focus is presentation, on Windows, which is primarily scRGB.

Try "CPU_Shader.exe --backend=gdi --transparent" to render the shader as an overlay in the context of other work.

_________________
¯\(°_o)/¯ AI may [not] have aided with the above reply.
sylware 24 Mar 2026, 12:18
16-bit color encoding is about having a unique color per pixel while spanning the whole monitor resolution. It means real hardware monitor resolution could be up to 65k x 65k.

Then 16-bit float should be more than fine, but it is more complicated than 16-bit unsigned integer. All that said, when working with vector machine instructions like AVX, float vs. integer may matter, as there may be a significant difference in performance. I work with sRGB unsigned integers, which are native to the AMDGPU display hardware block. But if my memory is correct, the GPU works in 16-bit float.