
Graphics cards and Monitors for the shack/ DXLab


 

+ AA6YQ comments below

An important distinction is that most of the features of these high-performance graphics cards will not be used at all by typical ham applications. Almost no ham applications use 3D graphics. Instead, it's the 2D rendering performance that is key, but that is rarely discussed in card reviews. This matters because the high-end graphics market is in such disarray right now that it may confuse and scare hams away from picking up a very fine graphics card that would meet their needs.

Fortunately, there is a web page that ranks the relative speeds of graphics cards, both old and new.



Take note of the G2D column, and sort from highest to lowest. Then just go down that list and check whatever graphics card you are interested in.

Notice that the fastest card on the list scores about 1000 in 2D speed. The card in my main PC (a GTX 1060) scores 770 - a card that must be at least 5 years old. It runs a trio of UHD monitors here in the shack, and you can get a similar one for under $100 on eBay.

I'm not advocating this particular card, just noting that whatever card you are thinking of buying can be judged on its speed by consulting this list. Anything in the top half (500 or more) will almost certainly be completely fine for ham applications.
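To make the selection procedure concrete, here is a minimal sketch of sorting and filtering such a list; the card names and G2D scores are invented placeholders, not actual PassMark results:

```python
# Sketch: sort a benchmark table by its G2D (2D graphics) score and keep
# cards at or above the ~500 threshold suggested above. The entries are
# hypothetical placeholders, not real benchmark data.
cards = [
    ("Card A", 980),
    ("Card B", 770),
    ("Card C", 430),
    ("Card D", 615),
]

# Sort from highest to lowest G2D score, as you would on the web page.
ranked = sorted(cards, key=lambda c: c[1], reverse=True)

# Keep anything scoring 500 or more - "completely fine for ham applications".
fine_for_ham = [name for name, g2d in ranked if g2d >= 500]
print(fine_for_ham)
```

The same two steps (sort descending on G2D, then apply the 500 cutoff) are exactly what the web page's column sorting lets you do by eye.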

+ It would be nice to have a "Selecting a Video Card" article in the DXLab Wiki to go with the "Selecting a Monitor" article developed by Dave W6DE:



+ The above guidance regarding 2-dimensional rendering performance would certainly be included in "Selecting a Video Card", but there are additional considerations:

- resolution

- refresh rate

- power consumption

- physical size

- monitor connectivity (HDMI, DisplayPort, etc.)

- motherboard compatibility

+ Would someone like to take a crack at this?

73,

Dave, AA6YQ


 

Selecting a video card may be a moot point these days, as the
integrated video capabilities of many newer systems are more
than adequate. For example, the integrated video in the
Ryzen 7 and Ryzen 9 based "mini PCs" ranks in the high 700s
in the videocard benchmark 2D listings and handles 2, 3, or 4
monitors at 4K/120 Hz or better.

The W6DE article on selecting a monitor remains valid, the key
points being total pixels (H x V) as a gauge of how much can
be displayed at any one time, and pixel pitch (H pixels/screen
width), which governs the readability of the smallest fonts
(smallest details) within an application.

I found an inexpensive 35" (diagonal - 32" wide) 3440 x 1440
gaming monitor quite satisfactory 95% of the time (the only
nit being the ability to read some of the smallest fonts, e.g.
when editing SQL filters in SC - which is easily addressed
with the Windows magnifier [Win-+/Win-Esc]). I am able to
display Commander, Commander Bandmap, DXView (no map),
DXKeeper, SpotCollector, WSJTX and JT Alert at the same time,
with spare screen space for Waterfall Bandmap (or Commander
Waterfall if I had a supported radio).

A 43" 4K TV (3840x2160) would have approximately the same dot
pitch as my 35" 3440 x 1440 gaming monitor but would be half
again taller which might lead to "stiff neck syndrome" when
trying to see the top third of the screen.
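W4TV's comparison can be checked with a little geometry: diagonal length in pixels via the Pythagorean theorem, then pixels per inch and physical height:

```python
import math

def ppi_and_height(diag_in, h_px, v_px):
    """Return (pixels per inch, physical height in inches) for a monitor."""
    diag_px = math.hypot(h_px, v_px)        # diagonal length in pixels
    ppi = diag_px / diag_in                 # pixel density
    height_in = diag_in * v_px / diag_px    # physical screen height
    return ppi, height_in

gaming_ppi, gaming_h = ppi_and_height(35, 3440, 1440)   # 35" ultrawide
tv_ppi, tv_h = ppi_and_height(43, 3840, 2160)           # 43" 4K TV

print(f"35 in ultrawide: {gaming_ppi:.0f} ppi, {gaming_h:.1f} in tall")
print(f"43 in 4K TV:     {tv_ppi:.0f} ppi, {tv_h:.1f} in tall")
print(f"height ratio:    {tv_h / gaming_h:.2f}")
```

The result (roughly 107 ppi vs. 102 ppi, with the TV about 1.56 times taller) bears out both claims: nearly the same dot pitch, and "half again taller".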

73,

... Joe, W4TV

On 2025-03-26 2:13 AM, Dave AA6YQ wrote:


 

I've followed this thread with some interest. It seems to me that an important consideration in computer, monitor, and graphics card purchases is missing.

I believe one should make purchases that reflect what the demands on these devices are likely to be in the future rather than what they are in the present. While the future is unknown, it seems very likely that demands for more memory and for more sophisticated, faster processors will keep growing as they have in the past. High-end graphics cards will be a requirement in the AI world we'll be living in well before you want to retire today's computer purchase.

Even if you can afford a computer used exclusively for amateur radio, you are likely to benefit from buying the most powerful hardware you can afford rather than buying only what's needed to run the software hams are using today.

K4KGG, Larry


 

+ AA6YQ comments below
I've followed this thread with some interest. It seems to me that an important consideration in computer, monitor, and graphics card purchases is missing.

I believe one should make purchases that reflect what the demands on these devices are likely to be in the future rather than what they are in the present. While the future is unknown, it seems very likely that demands for more memory and for more sophisticated, faster processors will keep growing as they have in the past. High-end graphics cards will be a requirement in the AI world we'll be living in well before you want to retire today's computer purchase.

Even if you can afford a computer used exclusively for amateur radio, you are likely to benefit from buying the most powerful hardware you can afford rather than buying only what's needed to run the software hams are using today.
+ That's a slippery slope:

1. It's not clear that high-end graphics cards will be a requirement for using AI. Yes, the development and training of Large Language Models (LLMs) currently involves the heavy use of graphics processing units (GPUs) because GPUs can rapidly execute the matrix multiplication operations employed in neural networks. However, exploiting LLMs - which is what hams would likely do - does not require such computation. Furthermore, the DeepSeek LLM demonstrates that the race for market share led many LLM developers to forgo optimizations that enable training with significantly less computation.

2. With respect to CPUs, plotting available CPU models on a price vs. performance graph generally reveals a sweet spot. At the high end, paying an extra 25% to get 10% more performance makes little sense. Larger CPU caches are worthwhile. More CPU cores will definitely make DXLab applications run faster. As described in the "Hardware Capabilities" section of



SpotCollector alone can make good use of 3 cores. One core for each additional application you expect to consume CPU cycles in parallel is a reasonable rule of thumb. Don't count the DXLab Launcher, as it only consumes significant cycles during upgrades. You won't need 256 cores unless you're going into weather forecasting.

3. With respect to RAM, more is always better. While 32 GB may be more than sufficient today, a motherboard with 32 GB that can be expanded to 64 GB when you need it is preferable.

4. Secondary storage - whether solid state or rotating - is easy to expand as your needs evolve. The 4 terabyte rotating drive I use for local backup connects to my laptop via USB 3.2 and cost $112! No, it won't meet my needs if I start DXing for dark matter and begin recording and analyzing data from radio telescopes, but likely other upgrades will be required if I head in that direction.
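The core-count rule of thumb in point 2 above can be sketched as a small helper; the application names in the example are illustrative, and only the "3 cores for SpotCollector" figure comes from the text:

```python
def recommended_cores(parallel_apps):
    """Rule of thumb from the text: 3 cores for SpotCollector, plus one core
    per additional application expected to consume CPU cycles in parallel.
    The DXLab Launcher is excluded, since it is only busy during upgrades."""
    SPOTCOLLECTOR_CORES = 3
    countable = [app for app in parallel_apps if app != "DXLab Launcher"]
    return SPOTCOLLECTOR_CORES + len(countable)

# Hypothetical shack: WSJT-X and DXKeeper busy alongside SpotCollector.
print(recommended_cores(["WSJT-X", "DXKeeper", "DXLab Launcher"]))  # → 5
```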

73,

Dave, AA6YQ


 

On 3/26/2025 1:36 PM, Larry, K4KGG via groups.io wrote:
I've followed this thread with some interest.
So have I, a bit. And a VERY important issue has been missing from the discussion -- RF NOISE! Monitors and their power supplies are a well-known source of noise on our ham bands, and the last thing we need is more of that. This is an issue you're not going to be able to figure out in the showroom or on the internet -- you won't know until you fire it up and poke around it with a battery operated receiver that tunes to HF.

My favorite probe has long been a vintage Kenwood TH-F6A talkie, which has wideband RX from below the AM broadcast band to about 550 MHz, and detectors for AM, FM, and SSB.

If you don't have something like that, an inexpensive Tecsun AM/FM/shortwave receiver will do the job. They use DSP chips designed for consumer radios and their RF performance is amazingly good. My favorite hotel radio, the PL380, sells for about $55. Coupled to a roof-top dipole, it can receive a weak FM station sandwiched between two much louder ones on the frequencies immediately adjacent on both sides. That's something my vintage Technics receiver with a six-gang variable capacitor couldn't do - and it sold for $500 25 years ago, when it was the best you could buy.

Another issue has arisen with some models of touchscreen monitors. A member of a local club gave me a vintage Samsung with that feature that did flips when he transmitted AND produced lots of noise. I quickly re-gifted it, after figuring out that it couldn't be tamed with ferrites.

My recommendation is to try to choose from something you can buy from a vendor like Costco, who offers no questions asked returns (limited to 90 days for certain products like computers and major appliances). That way, if you get it home and it's a dog you can return it.

If you buy something that's noisy and you can't return it, check out my tutorial on finding and killing noise sources. There are two versions, both PDFs. The text version is
The slides for talks I've given at Pacificon, Visalia, and several ham clubs are

73, Jim K9YC


 

+ AA6YQ comments below

+ Good point, Jim. It would be nice to establish and maintain a list of "known quiet" monitors.

+ Before I got my ham license in 1990, I bought a Commodore Amiga to introduce my sons to electronic music. The first time I connected it to my IC-735, there were strong birdies everywhere - on every band. I had to completely enclose the monitor in aluminum foil before I could hear any DX.

73,

Dave, AA6YQ



 

The other argument against buying the absolute high end CPU (more CPU
than necessary for the purpose) is that Moore's Law still holds when
it comes to CPU performance vs. price. By the time the user's needs
even come close to the capability of the CPU the price of that CPU
will drop significantly or newer CPUs will be available with major
advances in capability.

A typical lifetime of computer technology is on the order of
six years - or roughly two major versions of Windows. Purchasing
capability beyond that lifetime is a waste. At every 6-8
year point I have replaced computers with new systems having four
times the performance (MOps/sec, RAM, etc.) of the previous
system at half the cost of the "obsolete" system.

Today, with systems featuring CPUs at the median ~24,000 CPU Mark
level available for between $350 and $500, it makes no sense to
pay three to four times as much for the highest performance CPUs
that show CPU Marks of ~28,000-29,000.
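Using the figures above (median ~24,000 CPU Mark in the $350-$500 range, versus ~28,500 at "three to four times as much"), the value gap is easy to quantify. The $425 and $1,400 price points below are assumptions derived from those ranges, not quotes:

```python
# Marks-per-dollar comparison using the rough figures from the text.
# $425 is the midpoint of the quoted $350-$500 range; $1,400 assumes
# "three to four times as much" for the high-end part.
median_cpu = {"marks": 24_000, "price": 425}
highend_cpu = {"marks": 28_500, "price": 1_400}

def value(cpu):
    """CPU Mark points delivered per dollar spent."""
    return cpu["marks"] / cpu["price"]

print(f"median:   {value(median_cpu):.1f} marks/$")
print(f"high end: {value(highend_cpu):.1f} marks/$")
```

Under these assumptions the median part delivers well over twice the performance per dollar of the flagship.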

73,

... Joe, W4TV

On 2025-03-26 5:57 PM, Dave AA6YQ wrote:


 

I'm reviewing various graphics cards and seeing many costing well over $300-400. I'm also seeing various 4K NUC mini computers with 3-4 video outputs (HDMI) that are apparently 4K or even 8K capable and advertised for video production or high-end gaming. Upgrading may not be cost effective, grin.

Rich, NU6T

On Wed, Mar 26, 2025 at 5:00?PM Joe Subich, W4TV via <lists=[email protected]> wrote:









--
Richard Hill


 

hello,

"SpotCollector alone can make good use of 3 cores"
Is SpotCollector multi-processor or multi-core aware?

I found a link to a very old article here,



where the author writes:

"However, we have run into a few instances where 32-bit applications run with no issues on single processor systems, but fail to run, or perform poorly on multiprocessor systems."

I don't know whether this claim is still valid. And unfortunately, Microsoft has removed the "Affinity" option from the Task Manager in Windows 10 and 11.

73 de
Salvatore (I4FYV)


 

correction:

the Task Manager still has the Affinity option, but it is accessible only in "Details" view.

Sorry.

73 de
Salvatore (I4FYV)


 

Moreover, I see that all running processes have their affinity set to all CPUs.

73 de
Salvatore (I4FYV)


 

+ AA6YQ comments below
is SpotCollector multi-processor or multi-core aware?
+ I did not design SpotCollector or any other DXLab component to detect and exploit the presence of multiple cores. However, testing revealed that SpotCollector's performance does increase if 2 or 3 cores are assigned - likely due to parallelism in the Jet database engine. If you have SpotCollector configured to direct PropView to generate a propagation forecast for each new Spot Database Entry, then assigning PropView a core of its own would likely also improve performance.
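On Windows, that per-process core assignment is made via Task Manager's Details view ("Set affinity") or `start /affinity`. As a rough illustration of the same idea, here is how a process's affinity mask can be inspected from Python on Linux (`os.sched_getaffinity` is not available on Windows, where psutil's `cpu_affinity()` plays a similar role):

```python
import os

# The set of CPU cores the current process is allowed to run on.
# (Linux-only API; on Windows, Task Manager's "Set affinity" dialog or
# "start /affinity" serves the same purpose.)
allowed = os.sched_getaffinity(0)   # 0 = the calling process
print(f"may run on {len(allowed)} of {os.cpu_count()} cores: {sorted(allowed)}")

# Pinning a process to cores 0-2 (mirroring "assign SpotCollector 3 cores")
# would be:  os.sched_setaffinity(pid, {0, 1, 2})
```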

+ See the Hardware Capabilities section of



73,

Dave, AA6YQ


 

Regarding SpotCollector's hardware needs, I think they relate to the motherboard hardware and not to the graphics card - unless the graphics card uses motherboard resources, in which case a limitation in those resources could affect the graphics system and display capabilities (probably on low-end systems) under heavy loads.

Yes?

Rich, NU6T


On Thu, Mar 27, 2025 at 10:49?AM Dave AA6YQ via <aa6yq=[email protected]> wrote:



--
Richard Hill


 

+ AA6YQ comments below

Regarding SpotCollector's hardware needs, I think they relate to the motherboard hardware and not to the graphics card - unless the graphics card uses motherboard resources, in which case a limitation in those resources could affect the graphics system and display capabilities (probably on low-end systems) under heavy loads.

+ Neither SpotCollector nor any other DXLab application directly utilizes graphics card resources. They rely on Windows to display their windows using whatever hardware is provided for that purpose.

73,

Dave, AA6YQ


 

I'm continuing to try to define the graphics card requirements for ham radio work, specifically the DXLab suite, N1MM, WSJT-X, and the like. I am wondering whether a 2D-heavy application like AutoCAD might reflect the basic requirements. I plan to use the information below to evaluate mid-level graphics cards via the G2D performance charts on the Video Benchmark site:



For 2026, AutoCAD (Computer Aided Design) specifies a Basic system as a 2 GB GPU with 29 GB/s bandwidth and DirectX 11 compliance. A Recommended system is an 8 GB GPU with 1066 GB/s bandwidth and DirectX 12 compliance.


I've been exploring my Dell 8950. If you type "dxdiag" into the search bar of the Windows 11 Start page, you will get the DirectX Diagnostic tool, which shows some of your graphics card's capabilities. My Intel UHD Graphics 730 "basic" card has DirectX 12 and 4 GB of total memory.

I needed more information to determine whether this GPU meets the AutoCAD recommendations, so I downloaded a free tool called GPU-Z (a Google search will find it). It shows that I have DirectX 12 (12_1), a 128-bit memory bus, and 140.5 GB/s bandwidth.


That tells me that my low-end GPU meets the Basic requirements for AutoCAD and more, but not the full Recommended requirements. This set of tools can help you understand what your system has available.

I don't know whether the AutoCAD requirements are relevant to ham radio needs, but they're the best benchmark I have so far. CAD is heavily 2D with very little 3D. AutoCAD's minimum resolution requirement is 1080p, but higher resolution is desirable. There are additional requirements depending on the AutoCAD variant.
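The spec-versus-tier comparison above can be written out as a small check. The threshold numbers are the ones quoted in this message; representing DirectX versions as plain integers is a simplification:

```python
# AutoCAD GPU tiers as quoted in the text: Basic = 2 GB VRAM, 29 GB/s,
# DirectX 11; Recommended = 8 GB VRAM, 1066 GB/s, DirectX 12.
BASIC = {"vram_gb": 2, "bandwidth_gbs": 29, "directx": 11}
RECOMMENDED = {"vram_gb": 8, "bandwidth_gbs": 1066, "directx": 12}

def meets(card, tier):
    """True if the card meets or exceeds every figure in the tier."""
    return all(card[key] >= tier[key] for key in tier)

# Intel UHD Graphics 730 figures as read from dxdiag/GPU-Z in the text.
uhd730 = {"vram_gb": 4, "bandwidth_gbs": 140.5, "directx": 12}

print("meets Basic:      ", meets(uhd730, BASIC))
print("meets Recommended:", meets(uhd730, RECOMMENDED))
```

As the text concludes, the card clears the Basic tier (and then some) but falls short of Recommended on VRAM and bandwidth.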

73
Rich, NU6T


On Tue, Mar 25, 2025 at 11:13?PM Dave AA6YQ via <aa6yq=[email protected]> wrote:









--
Richard Hill


 

+ AA6YQ comments below
I'm continuing to try to define the graphics card requirements for ham radio work, specifically the DXLab suite, N1MM, WSJT-X and such
+ Any interface from a reputable supplier that supports your chosen monitor's resolution, refresh rate, and video connector (HDMI or DisplayPort) will be fine.

73,

Dave, AA6YQ


 


Perplexity offers this comment about Graphics card memory and system memory:


A graphics card primarily uses its own dedicated memory, known as Video RAM (VRAM), for handling graphics-related tasks such as storing textures, shaders, and rendered images.

However, in certain situations a graphics card can utilize the motherboard's system RAM, but this is not a standard or preferred method, for several reasons:

--Integrated Graphics: When a system uses integrated graphics (i.e., graphics processing capabilities built into the CPU or motherboard), it relies on system RAM for graphics processing, because integrated graphics have no dedicated VRAM of their own.

--Shared Memory: Some systems, particularly those with integrated graphics, allow a portion of system RAM to be allocated as shared memory for graphics processing. This is more common in laptops or systems without a dedicated graphics card.

--PCIe Bandwidth: Modern graphics cards can access system RAM via the PCIe interface, but this is generally slower than using VRAM. The PCIe bus allows data transfer between the GPU and system memory, but it is not typically used for primary rendering due to bandwidth limitations.

--Outsourcing to System RAM: When a GPU runs out of VRAM, it may use system RAM as a fallback, but this is inefficient and can cause performance drops.


In summary, while a graphics card can technically use motherboard memory under specific conditions, it primarily relies on its own VRAM for optimal performance.? While a GPU cannot directly use system RAM as a substitute for VRAM, system RAM can be utilized in specific contexts through software management or when using integrated graphics. The performance impact of relying on system RAM for graphics tasks is generally significant due to slower access speeds compared to dedicated VRAM.

---------------------------------------- End of Perplexity quote ----------------------------------------


In my shack computer I have an old graphics card with 4 GB of memory. That is not enough memory to store the images on my 34", 3440 x 1440 monitor (DXLab + WSJT-X improved) and my 21", 1920 x 1080 monitor (DXAtlas + DXMaps). So when my video system goes to sleep, it takes an annoying amount of time to refresh the two screens on wake-up.


On my house office computer I have a video card with 16 GB of video memory and a 27", 2560 x 1440 monitor (3,686,400 pixels). On that video system, when awakened from sleep, the previous images pop up instantly. I can't afford a card with enough video memory to pop up that many pixels - nor would I buy the computer case big enough to fit that card.


When running DXLab and associated amateur radio software, we have a lot of pixels loaded with useful images and text, but the rate of change of those pixels is quite slow compared to design tools and gaming.
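For a sense of scale, the raw framebuffer for the displays discussed in this thread is easy to work out at 4 bytes per pixel (32-bit color); note that actual VRAM consumption is considerably higher, since drivers keep multiple buffers and cached surfaces:

```python
def framebuffer_mb(h_px, v_px, bytes_per_px=4):
    """Size of one uncompressed frame at 32-bit color, in megabytes."""
    return h_px * v_px * bytes_per_px / (1024 * 1024)

# The two shack monitors described above.
for name, h, v in [("3440 x 1440 ultrawide", 3440, 1440),
                   ("1920 x 1080 secondary", 1920, 1080)]:
    print(f"{name}: {framebuffer_mb(h, v):.1f} MB per frame")
```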


Display pop-up time is not a meaningful measure by itself, but it does show the relative speed of updating from video memory versus motherboard memory.


73,

Dave, w6de


From: [email protected] <[email protected]> On Behalf Of Richard Hill via groups.io
Sent: 27 March, 2025 18:10
To: [email protected]
Subject: Re: [DXLab] Graphics cards and Monitors for the shack/ DXLab
