C++ Debugging in Eclipse CDT on Linux
This post is a reminder to myself, and may help others, of how I managed to get Eclipse CDT to work meaningfully with gdb for debugging.
In its default state with CDT, Eclipse often gives no visibility into vectors and more complicated containers when you’re debugging. You can see the vector itself, but its contents come with an error that the Internet says “is as designed”. (In a future update, I’ll include the error. For now, I’m just happy with a solution.)
I think this is due to missing debug flags (-g3) when compiling. Sorting this out properly is less important than finishing my college courses, though.
The issue seems to be in how you create the C++ project in Eclipse.
To make it work:
(1) In Eclipse, in the Project Explorer, right-click in a blank area, and select New -> Project. (Not C/C++ Project!)
(2) In the “Select a wizard” prompt, under the C/C++ folder, select “C++ Project”.
(3) In the next dialog, select Linux GCC as your toolchain.
Add new .cpp source files, or edit the Hello World it gives you, build the project, and debug it – now with meaningful views into your containers.
An Afternoon Diversion in MacOS 9
To satisfy a curiosity, and scratch an itch that never got scratched (Mac G3s were awfully pretty in their day), I tried installing Mac OS 9 in a virtual machine.
Lo and behold:
I can’t decide what I find more interesting:
- MacOS 9 had a version of Internet Explorer
- It was this easy to get running under QEMU
- MacOS 9 isn’t particularly intuitive at first blush
I followed https://www.jamesbadger.ca/2018/11/07/emulate-mac-os-9-with-qemu/ , ignored the Mac-specific parts, used the QEMU PPC build available in Debian 12, and used the following as my command line:
qemu-system-ppc -cpu g4 -M mac99,via=pmu -m 512 -hda macos9.img -cdrom "Mac OS 9.2.2 Universal Inst.iso" -boot c -g 1024x768x32 -device usb-kbd -device usb-mouse
Obviously, substitute in your own hard disk image and CD-ROM installer file names.
Dear Computer Display Manufacturers – Anything Other Than LED, Please
For years now, consumer displays have featured LED backlighting. LED-backed displays came with advantages – lower power consumption, the possibility of thinner displays, and longer life.
They were also primarily blue, and therefore definitely appeared brighter on the shelf next to older technologies, and had “more contrast”.
Note: I am not a smart man. I don’t know what goes into the supply chain of current display technologies.
Computer displays are ripe for disruption
I think it is fair to say most office workers have had it with LED-lit displays – and, really, with any display with poor spectral response curves.
We don’t like to look into blue crystal balls all day. We don’t like to look at blue things with no recognizable contrast.
There is also a large audience of folks buying displays for better color. That group is much larger than the current gaming crowd.
You don’t have to look far for something better
Go back fifteen years or so. An unused HP LP2065 is a great example – an IPS panel with a CCFL backlight. The appearance and comfort of the display in its default 6500K setting is fantastic.
There are many other examples.
Contrast
There has been much ballyhoo written about the dangers late-night blue light poses to sleep.
I think the problem for an average home or office worker isn’t directly blue itself; the issue is the lack of light in the rest of the spectrum. Our worlds are made up of all three primary colors, with broad frequency responses. We understand complex colors in what we see in everyday life – after all, the sun provides this all the time. We know better when to shut off a display that reasonably represents the visible spectrum for our own health. We think more clearly in front of such a display.
When a display’s primary color is blue or when the spectrum is represented by three or fewer sharp shallow response peaks, there is such cognitive load in ingesting the colors that it is a stressor. We fight just to make out what’s displayed.
Having to interpret color is true of every artificial light source, but with displays it is a particular challenge, since we look directly into the light. The display is the sole arbiter of relatable color.
Even though new displays measure better on contrast, the perceivable visual contrast is far better on older displays.
… and older displays are better this way because they cover a much broader swath of the visible spectrum, even if you limit the comparison to sRGB. Modern LED displays cover comparatively little.
Manufacturers are attempting to improve things
I mean, the manufacturers know there is an issue.
In fact, nearly every new feature can be interpreted as an attempt at “colors here are more like displays of old”.
CCFL isn’t the only option
First, imagine what something like a 32″ 4K IPS display with modern LG tech would look like with a CCFL backlight; it would be glorious.
If a company were dead-set on using LED, then three-color LED might work better – if we could get three lamps, each with a broad response in its specific color, instead of what we get today.
Other tech would work as well – even things like rear projection DLP on the desktop with a lamp, maybe metal halide. Imagine if R&D were spent on rear projection screen tech!
We want a display that is comfortable to use on our desks at home, work, and school today. Whether that display lasts four years or fifteen is, I think, irrelevant.
We also don’t want displays that are artificially bright with blue. Just stop it already!
We are adults, crying out for better displays.
Manufacturers, please, for the love of everyone’s eyes and sanity, give us better displays. We will reward you handsomely for it.
Speed of BASIC Arithmetic on a Commodore 64
I grew up with a C64. A relative gave it to me; at the time, it had started going out of fashion in favor of AOL, Prodigy, and all things online.
I learned to love programming because of it, and it started a lifetime of working with computing systems.
I’ve recently rediscovered VICE. Not only are the C64 and C128 emulated, but the C64 with a Creative Micro Designs SuperCPU is emulated as well. I have access to all the things I wanted as a kid but couldn’t afford, on the computer I already have today on my desk.
So I start up the emulated C64 with the SuperCPU, and lo and behold, it’s awesome. Programming in BASIC on this is on another level; much more comfortable than my memory of it.
Today’s find: for some reason, doing arithmetic on integer variables is slower than doing it on the default floating-point variable type.
As an example:
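(The original example got lost, so here is a sketch of the kind of comparison I mean – my own loop counts, timed with the jiffy clock TI, which ticks 60 times per second:)

```basic
10 TI$="000000":REM RESET JIFFY CLOCK
20 A%=0:FOR I=1 TO 1000:A%=A%+1:NEXT
30 PRINT "INT JIFFIES:";TI
40 TI$="000000"
50 A=0:FOR I=1 TO 1000:A=A+1:NEXT
60 PRINT "FLOAT JIFFIES:";TI
```

Run it and compare the two printed jiffy counts; the integer loop comes out slower.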
The difference is 3/60ths of a second; it takes about 10.3% longer to do this arithmetic on an integer variable than on a floating-point variable.
Maybe the reason BASIC included integer variables was memory savings more than speed. Maybe the floating-point version is faster because the interpreter converts integer variables to floating point for the arithmetic anyway, and then spends extra time converting the result back to store it.
Surprising to me either way, considering the complexity of performing floating point math on a 6502.