man, that’s tempting. thanks for the heads up!
Serial and audio outs? You got me seriously rethinking my bench…
Which KVM, if you don’t mind my asking?
I don’t use Wayland for other reasons, but if I did and it broke barrier I’d switch to x11.
Might be worth investigating what you use that is incompatible with x…
Barrier is Synergy, but free.
E: It works fine for me across macOS, Windows, and Linux, but I don’t use Wayland, so that might affect you.
E2: Looks like Wayland breaks Barrier.
Someone already said “they all already work that way,” but what are you actually trying to do?
Short answer: no.
Long answer: also no, but in some specific circumstances yes.
Your display uses energy to do two things: change the colors you see and make them brighter or dimmer. Honestly, it also has a little processor in it, but that sucker is so tiny and energy efficient that it’s not affecting things much, and you can’t control it anyway.
There are two ways to do the things your display does. One way is to have a layer of tiny shutters in front of a light source that open up when energized and let light through their red, blue, or green tinted windows. In this case you can use two techniques to reduce the energy consumption: open fewer shutters or reduce the intensity of the light source. Opening fewer shutters seems like it would be part of lowering the resolution, but when you lower the resolution you just get more shutters open for one logical “pixel” in the framebuffer (more on that later).
Another way to do what your display does is to have a variable light source behind each tinted window and send more or less luminance through each one. In this case there is really only one technique you can use to reduce the energy consumption of the display, and that’s turning down the brightness. This technique has the same effect as before when you lower the resolution. It’s worth noting that a “darker” displayed image will consume less energy in this case, so if you have an oled display, consider using a dark theme!
So the display itself shouldn’t save energy with a lowered resolution.
Your gpu has a framebuffer, which is some memory that corresponds to the display frame. If the display is running at a lower resolution the framebuffer will be smaller, and if it’s running at a higher resolution it’ll be bigger. Memory is pretty energy efficient nowadays, so the effect of a larger framebuffer on energy consumption is negligible.
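If you want a feel for the numbers, here’s a rough back-of-the-envelope sketch (assuming 4 bytes per pixel, which is typical for an RGBA framebuffer; your gpu/driver may use a different format and usually keeps more than one buffer around):

```python
# Rough framebuffer sizes at common resolutions, assuming 4 bytes per pixel (RGBA8).
def framebuffer_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: {framebuffer_bytes(w, h) / 2**20:.1f} MiB per frame")
# 1080p comes out around 7.9 MiB and 4K around 31.6 MiB -- a real difference,
# but tiny next to the gigabytes of VRAM on a modern card.
```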
Depending on your refresh rate, the framebuffer gets updated some number of times a second. But the gpu doesn’t completely wipe, rewrite, and resend the framebuffer every time; it just changes the stuff that needs it. So when you move your mouse at superhuman speed exactly one cursor width to the left in one sixtieth of a second, only two cursor-sized areas of the framebuffer get updated: the place the cursor was gets redrawn to show whatever was underneath, and the place the cursor now is gets a cursor drawn on it.
Okay but what if I’m doing something that changes the whole screen at my refresh rate? In that case the whole framebuffer gets updated!
But that doesn’t often happen…
Let’s say you’re watching a movie. It’s 60fps source material, so wouldn’t the framebuffer be updating 60 times a second? No! Not only is the video itself encoded so that parts of the picture that don’t change from frame to frame don’t have to be touched by the decoder, the decoder is actively looking for even more ways to avoid the work of changing parts of the framebuffer.
So the effect of a larger framebuffer on battery is minimized while playing movies, even when the framebuffer is huge!
But actually decoding a 3k movie is much more cpu intensive than decoding 1080, so maybe watch in 1080. That’s not about your display or its resolution though, it’s about the resolution of the source material.
Okay, but what about games? Games use the framebuffer too, but because they aren’t pre-encoded, they can’t take advantage of someone having already done the work of figuring out what parts are gonna change and what parts aren’t. So you pop into e1m1 and the only way the computer can avoid updating the whole framebuffer is when the stuff chocolate doom sends it doesn’t change the whole framebuffer, like those imps marching in place.
But chocolate doom still renders the whole scene, using computer resources to calculate the frame, draw it, and send it to the framebuffer, which looks it over and says “you did all this work to show me imp arms swinging over a one inch square of screen”?
But once again, chocolate doom takes more computer resources to render a 3k e1m1 than one in 1080, so maybe turn down your game resolution to save that energy.
Hold on, what about that little processor on the display? Well, it can do lots of stuff, but most of the time it’s doing scaling calculations so that when you run chocolate doom full screen at 1080 the image gets scaled as accurately and as nicely as possible across the whole screen, instead of being stuck at the top left or in the middle or something. So in that case you could actually make that little sucker do less work and use less energy by running at the display’s “native” resolution than you would at 1080.
So when jigsaw traps you in his airport terminal shaped funhouse and you wake up with the exploder on your neck and a note in front of you that says “kill carmack” and no charger in your bag, yes, you will save energy running at a lower resolution.
E: running chocolate doom at a lower resolution, not the display.
The op’s stated workload is better on nvidia.
You mainly want to be able to do 3d and video editing right?
Those two, specifically with davinci resolve and blender, work best with nvenc and cuda, the hardware encoder and the software libraries that let you take advantage of your nvidia card’s encoders and cuda cores.
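For the blender side, here’s a rough sketch of what “taking advantage of cuda” looks like in practice for a headless Cycles render (run with `blender -b yourscene.blend -P script.py`; the scene name is a placeholder and the exact property names can shift a bit between blender versions):

```python
import bpy  # blender's built-in python API, only available inside blender

# Tell Cycles to use CUDA devices (OPTIX is the other nvidia option on newer cards).
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"
prefs.get_devices()              # refresh the detected device list
for device in prefs.devices:
    device.use = True            # enable every device Cycles found

# Point the current scene's renderer at the GPU instead of the CPU.
bpy.context.scene.cycles.device = "GPU"
print("Rendering on:", [d.name for d in prefs.devices if d.use])
```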
So if you were building for that workload, you’d have an nvidia card and many problems people encounter in Wayland come from using it with an nvidia card.
So yeah it’s the nvidia support. Most people will say “fuck nvidia, just don’t buy their hardware” but it’s the best choice for you and would be a huge help, so choosing between Wayland and nvidia is a no brainer.
It is a bummer that you’ll need to install x specially, but I’d be really surprised if there isn’t decent support for that.
There’s always the hope that Wayland will get better over time and you’ll be able to use it in a few years.
E: a word on encoding: both amd and intel CPUs have video encode and decode support, but intel’s qsv is more widely supported and tends to be faster most of the time. When people suggest intel’s arc gpus, they’re saying it because those gpus use qsv, and for a video editing workstation they’d be a good choice.
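If you want to see what hardware encoding actually looks like in use, here’s a small sketch (it assumes ffmpeg is installed and was built with the qsv/nvenc encoders; the filenames and quality value are placeholders):

```python
import subprocess

# List the encoders this ffmpeg build knows about and pick out the hardware ones.
encoders = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
).stdout
for line in encoders.splitlines():
    if "qsv" in line or "nvenc" in line:
        print(line.strip())

# Transcode with intel quick sync (h264_qsv); swap in h264_nvenc to use the nvidia encoder.
subprocess.run(
    ["ffmpeg", "-i", "input.mp4", "-c:v", "h264_qsv", "-global_quality", "23", "output.mp4"],
    check=True,
)
```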
Part of the reason I put intel and amd cpus on an even footing for you is that any cost savings you’d get from going amd would likely be offset by the performance decrease. There are some good breakdowns of cpu encoder performance out there if you want to really dive in, but remember that you’re also in a good place to buy intel because of the crazy deals from the sky-is-falling people.
That kinda ties into the cores over threads thing too. If your computer’s workload is a bunch of little stuff, you can really make hay with a scheduler that’s always switching stuff around. One of the things that makes amd’s 3d processors so good at that stuff is their very big cache, which extends the benefit of multithreading schedulers up to larger processes. You’re looking at sending your computer a big ol’ chunk of work though, so you’re not usually gonna be multithreading with that powerful scheduler and instead just letting cores crunch away.
Part of the reason I didn’t suggest intel’s arc stuff is that you’re also doing 3d work, and being able to take advantage of the very mature cuda toolchain is more important.
Plus nvidia encoding is also great and if you were to pair it with an intel cpu you could have the best of both worlds.
You’re really looking to build something different than most people and that’s why my advice was so against the grain. Hope you end up with a badass workstation.
The op asked for help to make their experience as painless as possible and listed two primary use cases that are not only often tied to the problems people encounter with Wayland, but also work best with hardware that is itself a common source of those problems.
If someone said they need to haul hay I wouldn’t say “try it in your Saturn first and see if it works!” I’d say “make sure you have a truck or a trailer.”
The harm is in setting a person up for failure when they asked for help.
I am not going to fight you on if x is better than Wayland.
The op’s use case involves operations, software and hardware that function best with x.
The op should avoid Wayland.
you are getting advice that will make a good gaming pc but not a good workstation for what you said you’re gonna do.
do the opposite of what most everyone in this thread is saying:
intel over amd (this could actually go either way depending on the price point), nvidia over amd, start at 32gb of ram and go up from there. prioritize cores over threads, sneak a rotational hard disk in, spend more on your power supply than you planned to.
plan on not using wayland.
It’s turned up.
Turn it down.
I just saw your second diagram. If all you’re worried about is serveo being able to see your traffic, you could do all your http server stuff over https and require ECH on the client side, and you’d be okay?
I’m not 100% that would work perfectly, especially if DNS is involved, but I don’t think serveo messes with DNS.
The short answer is: you can’t do this.
The long answer is: you need to go through the process of getting a server you own and have provisioned installed at some colocation/datacenter place. It will be expensive to buy the server, expensive to buy rack space, and you will need to go through significant background and security checks in order to be allowed by the company to do this.
If that sounds terrible, and it is, you can use an overlay network like nebula. It still requires that you have a “server” somewhere, but you can use a $10/yr vps to host that. Your “server” is, in nebula’s terminology, the “lighthouse” node. All it does is punch through nats so people who connect to your overlay vpn are able to see each other.
Your vps provider can still see everything on the lighthouse though, so don’t keep your root certificate on it, and use unique credentials. Traffic doesn’t flow through the lighthouse, so you don’t need to worry about snooping, but if the provider got hold of your certificate authority they could sign themselves a cert and get on your vpn. So you have to have good security on your internal network.
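For a sense of the moving parts, here’s a rough sketch of a non-lighthouse node’s config written as a Python dict (nebula actually reads a YAML file, and every path and IP here is a placeholder; check nebula’s example config for the real options):

```python
import yaml  # PyYAML, just to print this in the YAML shape nebula expects

node_config = {
    # Certs signed by YOUR certificate authority -- keep the CA key off the vps.
    "pki": {
        "ca": "/etc/nebula/ca.crt",
        "cert": "/etc/nebula/host.crt",
        "key": "/etc/nebula/host.key",
    },
    # Map the lighthouse's overlay IP to the vps's public address.
    "static_host_map": {"192.168.100.1": ["<vps-public-ip>:4242"]},
    "lighthouse": {
        "am_lighthouse": False,       # True only in the config on the vps
        "hosts": ["192.168.100.1"],   # who to ask when punching through nats
    },
    "listen": {"host": "0.0.0.0", "port": 4242},
    "punchy": {"punch": True},        # keep nat mappings alive
}

print(yaml.safe_dump(node_config, sort_keys=False))
```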
Anything will be fine. I’d try an xfce/lxqt desktop, but even on old dual cores the newest kde is good.
Everyone says mint, but suse has a huge German community because it’s from Germany.
Another person said you should upgrade to ssd and maybe add more ram, and I agree with them. Usually I spend $40 to do that to laptops and it makes real dogs run great.
Post the model numbers on the bottom of the laptops and I can give some pre-gifting upgrade advice with actionable links. Both seem to take 2.5” sata ssds, so that’s good and cheap, but there are different models of the aspire es-15 that take different amounts of memory.
If you do take the cheap ssd replacement route, give them one of those usb hdd enclosures with the old big rotational hdd in there. They’re like seven bucks and it means they have a place to hold a backup of their data if the gift laptop dies.
Cups-browsed-eez nutz!