50 million rendered polygons vs one spicy 4.2MB boi
Maybe it's time we invent JPUs (JSON processing units) to level the playing field.
The best I can do is an ML model running on an NPU that parses JSON in subtly wrong and impossible to debug ways
So you're saying it's already feature complete with most json libraries out there?
Latest Nvidia co-processor can perform 60 million curly brace instructions a second.
Finally, something to process "databases" that ditched excel for json!
60 million CLOPS? No way!
Until then, we have simdjson https://github.com/simdjson/simdjson
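For anyone who hasn't tried it, the simdjson quick-start looks roughly like this (the file name and the field being read are the library's own demo placeholders, not anything from this thread):

```cpp
#include <cstdint>
#include <iostream>
#include "simdjson.h"

int main() {
    simdjson::ondemand::parser parser;
    // The input must be padded; padded_string::load handles that for files.
    simdjson::padded_string json = simdjson::padded_string::load("twitter.json");
    // On-Demand parsing: the document is walked lazily as fields are accessed,
    // so you only pay for the parts of the JSON you actually touch.
    simdjson::ondemand::document doc = parser.iterate(json);
    std::cout << uint64_t(doc["search_metadata"]["count"]) << " results" << std::endl;
    return 0;
}
```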
JSON and the Argonaut RISC processors
Well, do you have dedicated JSON hardware?
Please no, don't subsidize anything Java-Script. It will only make it less efficient.
@Randelung @seaQueue well, i have dedicated JavaScript hardware (https://developer.arm.com/documentation/dui0801/h/A64-Floating-point-Instructions/FJCVTZS)
You don't?
There were XML DOM accelerators for a while. Might still be out there.
Everybody gangsta till we invent hardware-accelerated JSON parsing
https://ieeexplore.ieee.org/document/9912040 "Hardware Accelerator for JSON Parsing, Querying and Schema Validation" "we can parse and query JSON data at 106 Gbps"
106 Gbps
They get to this result on 0.6 MB of data (paper, page 5)
They even say:
Moreover, there is no need to evaluate our design with datasets larger than the ones we have used; we achieve steady state performance with our datasets
This requires an explanation. I do see the need: if you promise 100 Gbps, you need to process at least a few Tb.
There is acceleration for text processing in AVX, IIRC (something like the sketch below).
Personally, now that I have a machine capable of running the toolchains, I want to explore hardware-accelerated compilation. Not all steps can be done in parallel, but I bet a lot before linking can.
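On the AVX point: here is a toy sketch of the kind of SIMD trick that simdjson-style parsers build on, scanning 32 bytes at a time for quote characters. It assumes AVX2 and a GCC/Clang-style compiler, and is nowhere near a real parser:

```cpp
#include <immintrin.h>   // AVX2 intrinsics (compile with -mavx2)
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <cstring>

// Count '"' characters in a buffer, 32 bytes per iteration.
static std::size_t count_quotes_avx2(const char* data, std::size_t len) {
    const __m256i quote = _mm256_set1_epi8('"');
    std::size_t count = 0, i = 0;
    for (; i + 32 <= len; i += 32) {
        __m256i chunk = _mm256_loadu_si256(reinterpret_cast<const __m256i*>(data + i));
        __m256i eq    = _mm256_cmpeq_epi8(chunk, quote);   // 0xFF where byte == '"'
        std::uint32_t mask = static_cast<std::uint32_t>(_mm256_movemask_epi8(eq));
        count += static_cast<std::size_t>(__builtin_popcount(mask));  // GCC/Clang builtin
    }
    for (; i < len; ++i)   // scalar tail for the last < 32 bytes
        count += (data[i] == '"');
    return count;
}

int main() {
    const char* json = R"({"name":"spicy boi","size_mb":4.2})";
    std::printf("%zu quote characters\n", count_quotes_avx2(json, std::strlen(json)));
    return 0;
}
```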
Render the json as polygons?
It's time someone wrote a JSON shader.
I just added this to my LinkedIn profile. Thanks!
That just results in an image of JSON Bourne.
JSON Sphere
That is sometimes the issue when your code editor is a web browser in disguise 😅
No, if you're struggling to load 4.2 MB of text, the issue is not Electron.
There are SIMD-accelerated JSON decoders
every day we stray further from god
CPU vs GPU tasks I suppose.
GPU, render my 4.2 MB JSON file!
I'm afraid I can't do that, Dave
Would you rather have 100,000 kg of tasty supreme pizza, or 200 kg of steaming manure?
Choose wisely.
200kg of steaming manure would be pretty sweet if you had a vegetable garden
Not sure I'd choose to use the word "sweet" here...
Careful, the 100,000 kg of pizza will turn into manure.
I figure I can probably convert about 10 kg into manure before it autoconverts into compost. Which is maybe even a worse problem.
The pizza can be used to feed some people but you really have to go fast and find hungry people
Manure can be sold easily
I have the same problem with XML too. Notepad++ has a plugin that can format a 50 MB XML file in a few seconds. But my current client won't allow plugins to be installed, so I have to use VS Code, which chokes on anything bigger than what I could format manually myself if I were determined.
Time to train an LLM to format XML and hope for the best
Meanwhile, I can open a 1GB file in (stock) vim without any trouble at all.
Formatting is what xmllint is for.
Just install Python and format it, then.
I use vim macros. You can do some crazy formatting with it
You don't need to open a file in a text editor to format it
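If you'd rather do it programmatically than with a CLI tool, here is a minimal sketch for the JSON case, using nlohmann/json as a stand-in library (my pick, not something anyone in the thread suggested) and assuming the whole file fits in memory; "jsonfmt" is just a made-up name:

```cpp
#include <fstream>
#include <iostream>
#include <nlohmann/json.hpp>

// Pretty-print a JSON file to stdout without ever opening it in an editor.
int main(int argc, char** argv) {
    if (argc < 2) {
        std::cerr << "usage: jsonfmt <file.json>\n";
        return 1;
    }
    std::ifstream in(argv[1]);
    nlohmann::json doc = nlohmann::json::parse(in);  // throws on malformed input
    std::cout << doc.dump(2) << '\n';                // 2-space indentation
    return 0;
}
```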
Someone just needs to make a GPU-accelerated JSON decoder
Works fine in vim
Except if it's a single-line file, only god can help you then. (Or running prettier -w on it before opening it, or whatever.)
:syntax off and it works just fine.
Reject MB, embrace MiB.
Reject MiB, call it "MB" like it originally was.
If you're not aware, it was called MB because of JEDEC, back before the IEC units were invented. The IEC units were introduced because they remove the double meaning of the JEDEC units, which can be read as decimal or binary. IEC units only carry the binary meaning, which is why they're superior. If you mean 1000 kB = 1 MB, use MB; but if you mean 1024 KiB = 1 MiB, you should be using MiB. It's all about getting the point across, and JEDEC units aren't good at it.
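For the meme's file, the difference is small but real. A quick worked example, assuming the 4.2 really is decimal megabytes:

```cpp
#include <cstdio>

int main() {
    constexpr double MB  = 1000.0 * 1000.0;  // SI / decimal megabyte: 1,000,000 bytes
    constexpr double MiB = 1024.0 * 1024.0;  // IEC binary mebibyte:   1,048,576 bytes
    constexpr double bytes = 4.2 * MB;       // 4,200,000 bytes
    std::printf("4.2 MB = %.0f bytes = %.2f MiB\n", bytes, bytes / MiB);
    // Prints: 4.2 MB = 4200000 bytes = 4.01 MiB
    return 0;
}
```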
You've got them confused, MiB is the one misusing metric
It isn't misusing metric, it just simply isn't metric at all.
Rockstar making GTA Online be like: "Computer, here is a 512 MB JSON file. Please download it from the server and then do nothing with it."
Let it be known that heat death is not the last event in the universe
You jest, but I asked for a similar (but much simpler) vector/polygon model, and it generated it.
The obvious solution is parsing JSON with GPUs? Maybe not...
Wow, wouldn't have guessed GPU architecture is compatible with parsing tasks
C++ vs JavaScript
It's more like GPU vs CPU
Given that it's the CPU limiting the parsing of the file, I wonder how a GPU-based editor like Zed would handle it.
Been wanting to test out the editor ever since it was partially open-sourced, but I'm too lazy to get around to doing it.
That's not how this works. GPUs are fast because the kind of work they do is embarrassingly parallel and they have hundreds of cores. Loading a JSON file is not something that can be trivially parallelized. Also, Zed uses the GPU for rendering, not for reading files.
I hate to break it to you bud, but all modern editors are GPU-based
As far as my understanding goes, Zed uses the GPU only for rendering things on screen. And from what I've heard, most editors do that. I don't understand why Zed uses that as a key marketing point.
To appeal to people who don't really understand how stuff works but think GPU is AI and fast