Smokeydope

joined 2 years ago
[–] Smokeydope@lemmy.world 3 points 1 day ago* (last edited 1 day ago)

I've pumped at least a few hours into at least 90% of the games I own, gave each entry a fair shake, and either moved on or come back to it now and again. It's the rare exception where a game is so up my alley I can pour endless hours into it without the experience getting boring.

I don't care about achievements or even completing a game. I play games mainly to try out a new experience and get the brain working in different ways. Some of my favorite games I can pump many hours into and have completed; roguelikes especially are infinitely replayable. But others are once-and-done experiences I got my fill of over the course of a couple of hours and have no desire to come back to. There's no shame in being the latter.

[–] Smokeydope@lemmy.world 1 points 2 days ago (1 children)

I always wondered why alien UFOs wouldn't hide in plain sight as airplanes, except that the blinking lights would surely alert air traffic control that an unregistered plane was in the airspace, right? But I dunno, it's an interesting idea, and your story kind of aligns with the concept.

[–] Smokeydope@lemmy.world 4 points 4 days ago (1 children)

What is 'the internet' to you? I think this term means different things to different people. I imagine to people born in the latest generations the internet is social media and corpo productivity sites. To them the internet is YouTube, TikTok, Twitter, Reddit, their bank, and whatever slop services they subscribe to, magically beamed into a pocket computer through technomagical nerd shit like "5G" and processed through "microprocessors" and other stuff they don't care to really understand because it's all abstracted away.

I was born early enough for the internet to be nothing more than two computers barely powerful enough to run a GUI calling each other up through telephone wires to share goofy web 1.0 blogspam. I remember when low-res images were the norm, and when pre-Google YouTube was just coming into being. When it was all AOL and Myspace and Newgrounds Flash games. I remember being a kid and loving computers because I never knew what cool new website was on the horizon to discover and play with. I remember that people used things like newsgroups and pre-Craigslist classifieds to meet up for transactions.

This is the internet, to me. At least what it once was and what it can be again. People using the digital landscape to freely express themselves with their own hardware. To come together to share in hobbies and interests and passions.

We could have that again if we all bought into a standardized radio-based mesh network that could host personal sites while acting as a routing node.

But I don't know if the general public will ever be pushed to partake in this network. They would have to be squeezed very hard to try alternatives to the common way of things.

[–] Smokeydope@lemmy.world 4 points 4 days ago* (last edited 4 days ago) (2 children)

gboard

If you haven't considered it already, please consider switching to an open-source keyboard that doesn't collect all your typing data, like HeliBoard.

[–] Smokeydope@lemmy.world 1 points 5 days ago (1 children)

Those pillows look like fake velvet microplastic fiber in that shot. If so, please consider switching your bedsheets and pillowcases out for natural fibers like cotton, hemp, or linen. Breathing in those fibers isn't good for you or your cats.

[–] Smokeydope@lemmy.world 42 points 5 days ago (2 children)

Gotta get the Bakugan balls in there too


[–] Smokeydope@lemmy.world 2 points 1 week ago* (last edited 1 week ago) (1 children)

It's more related to limits on the knowability of events beyond a certain scale. It's easy and intuitive to think of spacetime as quantized like pixels on a grid, with a minimum action requirement of time and energy to move between them. But it's not that simple, or at least that kind of granular discreteness is not proven (though there are digital physics frameworks that treat spacetime as discrete like this).

The Planck length does not define the minimum distance something can move, but rather the minimum scale of meaningful measurement that can make a bit of distinction between two microstates of information. In essence, it says that if two continuous computational paths differ by less than a Planck length's worth of distinction, there is no measurable difference between them and the paths get blurred together.

It's a precision limit that defines how exactly we can measure interactions happening within the distance between two points.

It's possible that spacetime is continuous at a fundamental level, but the Planck length represents the scale at which quantum fluctuations of spacetime itself become so violent that the concepts of a 'path' or a 'distance' can no longer be defined in the classical sense, effectively creating discrete quantized limits for measurement precision.

Ultimately this precision bound is related to the energy cost to actualize a measurement from a superposition, and the exponential increase in energy needed to overcome the uncertainty principle at smaller and smaller scales. The energy required to actualize a meaningful state from a sub-Planck length would be enough to create a kugelblitz, a black hole made from pure condensed energy.

This same logic applies to time, giving us the Planck time, the shortest meaningful interval. So, in a way, the Planck scale does define a fundamental limit on the 'speed' at which distinguishable events can occur.
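To make that concrete, here's the heuristic back-of-the-envelope version of the whole argument (a sketch, not a rigorous derivation). By the uncertainty principle, probing a distance $\Delta x$ costs roughly

$$E \sim \frac{\hbar c}{\Delta x}$$

and that much energy packed into that small a region has a Schwarzschild radius

$$r_s = \frac{2GE}{c^4} \sim \frac{2\hbar G}{c^3 \Delta x}$$

Demanding $r_s \lesssim \Delta x$, so the probe doesn't collapse into a horizon, bottoms out at

$$\ell_P = \sqrt{\frac{\hbar G}{c^3}}, \qquad t_P = \frac{\ell_P}{c} = \sqrt{\frac{\hbar G}{c^5}}$$

Push below $\ell_P$ and the energy needed to resolve the distinction creates a horizon bigger than the thing you're trying to measure, which is the kugelblitz scenario above.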

[–] Smokeydope@lemmy.world 1 points 1 week ago (3 children)

Is the speed of causation propagation linked to Planck length?

Yes, more specifically the Planck length is derived from an equation involving the speed of light/causality.

$$\ell_P = \sqrt{\frac{\hbar G}{c^3}}$$

where c is the speed of light, ħ is the reduced Planck constant, and G is the gravitational constant. Together they give the fundamental unit length of meaningful distinction, a very important yardstick for measuring the smallest distances.
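If you want to see the number fall out yourself, plug the CODATA constants in (just a quick sanity-check script):

```python
import math

# CODATA values
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

planck_length = math.sqrt(hbar * G / c**3)
planck_time = planck_length / c

print(f"Planck length: {planck_length:.3e} m")  # ~1.616e-35 m
print(f"Planck time:   {planck_time:.3e} s")    # ~5.391e-44 s
```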

[–] Smokeydope@lemmy.world 2 points 1 week ago (1 children)

I have no religious beliefs. The thing that trips me up is how is there matter in the first place if none can ever be created? Why was there stuff at a single point at some time

The "matter/information can't be created or destroyed" thing only applies to closed systems within their own operational bounds. It's about logical consistency within a closed set, but that tells us nothing about where the closed set itself came from. All the energy from the big bang/first universal iteration was loaned from somewhere else. The how and why of this is probably going to remain a mystery forever because our simulations of the laws of physics can't go back to before the big bang.

So the nature of the Big Bang, and why anything exists at all, is one of the big open-ended philosophy-of-science questions without an easy falsifiable answer. It's up to interpretation. I have my own theories on the topic, but any guess is as good as another.

From the good old classic "Because God Did It™" to "bubble universes that foam out from a hyperdimensional substrate with random laws of physics/math that sometimes allow for observation and life" and everything in between. It's all the same to me because we can't prove anything one way or the other.

[–] Smokeydope@lemmy.world 12 points 1 week ago (1 children)

What you're asking directly stems from two related open-ended philosophy-of-science questions: "Are universal constants actually constant?" and "Does the speed of light differ at any point in its journey between two points of space in a continuous substrate?"

The answer to both, like all philosophy questions, is a long hit on the pot pipe and a "sure man, it's possible, but it remains unlikely/over-engineering the problem until we have justification through observation." However, I'll give my two cents.

"" Are universal constants actually constant?" " it probably depends on the constant. Fundamental math stuff that tie directly into computations logic and uncertainty precision limits like pi are eternal and unchanging. More physics type constants derived from statistical distribution like the cosmological constant might shift around a little especially at quantum precision error scales.

The speed of light is probably closer to the first kind, as it's ultimately a mathematically derived logical boundary on how fast any two points in the universe can interact to quantize a microstate. It's a computational limit, and I don't see that changing unless the actual vacuum substrate of spacetime takes a sudden phase shift.

"Does the speed of light differ in speed at any point of time in its journey between two points of space in a continuous substrate?"

Veritasium did a good video about this one. The answer is it's possible but currently unmeasurable: you can only ever clock the round trip. So if all hypotheses generate the same effective results, the simplest among them (light maintaining a constant speed on both legs of the trip) is the most computationally efficient hypothesis to adopt.
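To spell out why only the round trip is measurable (standard textbook reasoning, nothing novel): a single clock at A times a pulse to a mirror a distance $d$ away and back, which only ever yields the two-way average

$$\bar{c} = \frac{2d}{\Delta t}$$

Any anisotropic convention with one-way speeds $c_+$ and $c_-$ satisfying

$$\frac{1}{c_+} + \frac{1}{c_-} = \frac{2}{\bar{c}}$$

predicts exactly the same round-trip time, so no single-clock experiment can tell the conventions apart.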

[–] Smokeydope@lemmy.world 26 points 1 week ago* (last edited 1 week ago) (4 children)

Do you really believe that in all of eternity, we happen to be just four and a half billion years in? We are probably on our infinite life, and have infinite more to go. Just completely random lives, no idea where we will end up, nothing persists.

Yes I do, though I must clarify it's the Earth that is estimated at 4.5 billion years old. The universe itself is currently estimated at 13.8 billion years since the Big Bang.

There's a difference between the philosophical idea of an eternal process of cosmological rebirth, and the experimentally observed behaviors of the current universe we live in captured with our most powerful instruments and our best mathematical models.

In the 20th century we built telescopes powerful enough to see into the very distant universe and track the movement of galaxies. Because of this technological achievement we observed some strange things.

First was that galaxies seemed to be moving further and further away from each other. Not only that, they were moving away at an accelerating pace. This uncovered the idea of cosmological expansion, that over time our universe "spreads out" and creates new space between already distant objects.
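The quantitative version of this is Hubble's law (the constant's exact value is still argued over; roughly 70 km/s/Mpc is a common ballpark):

$$v = H_0 d$$

so a galaxy 100 megaparsecs away recedes at roughly 70 × 100 = 7,000 km/s. The accelerating part showed up later, as careful deviations from this simple linear relation at very large distances.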

Second, because the speed of light is finite, this creates fundamental limits to how far we can observe (the cosmological horizon) and a crazy cool phenomenon where the further you look into the distant universe the further back in time you look due to the age of the light from the star and the distance it traveled. We can literally see how the universe looked billions of years ago and calculate how far back we are looking.
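The lookback arithmetic is as simple as it sounds for nearby objects (ignoring expansion, which matters over billions of light-years):

$$t_{\text{lookback}} \approx \frac{d}{c}$$

so a galaxy 100 million light-years away appears as it was 100 million years ago, by the definition of a light-year.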

If you look back far enough with microwave telescopes, you can map out the thermal radiation from when the universe was extremely hot and dense, about 380,000 years after the Big Bang. This is called the Cosmic Microwave Background. It shows the universe was once in a very condensed, high-energy state.

Third, we have concepts such as the second law of thermodynamics, which says entropy increases in closed systems. Energy always spreads out, and systems tend toward disorder on a global level. We have equations that very accurately describe this tendency.
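The two standard equations here, for the curious: Boltzmann's entropy, which counts the microstates $\Omega$ compatible with a macrostate,

$$S = k_B \ln \Omega$$

and the second law itself, which says the entropy of a closed system never decreases:

$$\frac{dS}{dt} \geq 0$$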

With these breakthroughs we had enough data to simulate accurate matter distributions of the current universe, observe and accurately model matter distributions in the distant past, and use that model to find a best prediction of what may happen in the future with what we currently know. All three lines of evidence point to a universe that is roughly 13.8 billion years old with a definite beginning and end state.

This can still be reconciled with spiritual beliefs if you're willing to redefine eternity as something more like an eternal cycle of rebirth, with the heat death of one universe bootstrapping the creation of the next iteration. You may enjoy Futurama's bit on it.

 

Please help me understand what went wrong and how I can fix it?

 

This is a simple toolchain that allows you to focus on writing your website instead of getting distracted with HTML formatting.


It works by taking in a gemtext file and converting it into an HTML file.

Gemtext:

[screenshot of example gemtext source]

HTML:

[screenshot of the generated HTML]
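For anyone curious what the conversion looks like mechanically, here's a minimal sketch of a gemtext-to-HTML pass in Python. This is just the general shape of the idea, not the exact code in the repo linked below:

```python
import html

def gemtext_to_html(gemtext: str) -> str:
    """Convert gemtext source to an HTML fragment, one line type at a time."""
    out, in_pre, in_list = [], False, False
    for line in gemtext.splitlines():
        # Close an open bullet list as soon as the run of "* " lines ends.
        if in_list and not line.startswith("* "):
            out.append("</ul>")
            in_list = False
        if line.startswith("```"):  # ``` toggles a preformatted block
            out.append("</pre>" if in_pre else "<pre>")
            in_pre = not in_pre
        elif in_pre:
            out.append(html.escape(line))
        elif line.startswith("###"):
            out.append(f"<h3>{html.escape(line[3:].strip())}</h3>")
        elif line.startswith("##"):
            out.append(f"<h2>{html.escape(line[2:].strip())}</h2>")
        elif line.startswith("#"):
            out.append(f"<h1>{html.escape(line[1:].strip())}</h1>")
        elif line.startswith("=>"):  # link line: "=> url optional label"
            parts = line[2:].strip().split(maxsplit=1)
            url = parts[0] if parts else ""
            label = parts[1] if len(parts) > 1 else url
            out.append(f'<p><a href="{html.escape(url)}">{html.escape(label)}</a></p>')
        elif line.startswith("* "):
            if not in_list:
                out.append("<ul>")
                in_list = True
            out.append(f"<li>{html.escape(line[2:])}</li>")
        elif line.startswith(">"):
            out.append(f"<blockquote>{html.escape(line[1:].strip())}</blockquote>")
        elif line.strip():
            out.append(f"<p>{html.escape(line)}</p>")
    if in_list:
        out.append("</ul>")
    if in_pre:
        out.append("</pre>")
    return "\n".join(out)

# Example:
# print(gemtext_to_html("# My site\n=> https://example.org A link\nHello world."))
```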

Code can be found here on the public Git:

https://codeberg.org/TomCon/gem2web/src/branch/master

 

I think I've discovered something important in the field I dabble in as an advanced hobbyist. This was a breakthrough and perspective shift big enough for me to stay awake all night into the morning, until I had to go to sleep, testing that it works and boilerplating the abstract paper. I constructed a theoretical framework and a practical implementation, and statistically analyzed experimental results across numerous test cases. I then put my findings into as good a technical paper as I could write. I did as much research as I could to make sure nobody else had written about this before.

At this point though, I don't really know how to proceed. I'm an outsider systems engineer, not an academic, and arXiv requires you to be endorsed/recognized as a member of the scientific community, with something like a college email or a written recommendation from someone already known. Then, whenever I look at the papers on arXiv, they always look a very specific way I can't get with LibreOffice Writer. There's apparently a whole bunch of rules on formatting and font and style and this and that. It's overwhelming and kind of scary.
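From what I can tell, that specific look mostly comes from LaTeX rather than a secret rulebook. Something like this minimal skeleton seems to be the starting point (the class and packages here are my guesses at common defaults, not official arXiv requirements):

```latex
\documentclass[11pt]{article}
\usepackage{amsmath}   % equations
\usepackage{graphicx}  % figures
\usepackage{hyperref}  % clickable links and references

\title{Paper Title Here}
\author{Author Name}
\date{\today}

\begin{document}
\maketitle

\begin{abstract}
One paragraph: problem, method, headline result.
\end{abstract}

\section{Introduction}
% ...

\section{Method}
% ...

\section{Results}
% ...

\end{document}
```

Compiling that with pdflatex (or pasting it into Overleaf) already produces something that looks like "an arXiv paper."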

So, what do I do here? I have something I think is important enough to get off my ass and get in touch with a local college, to maybe get a recommendation. I'd like to have my name in the community and contribute.

 


 
 

I now do some work with computers that involves making graphics cards do heavy computation on a headless server. The workload has nothing to do with graphics.

The name is more for consumers, based on the most common use for graphics cards and why they were first made in the '90s, but now they're used for all sorts of computational workloads. So what are some more fitting names for the part?

I now think of them as 'computation engines', analogous to an old car engine. It's where the computational horsepower is really generated. But how would RAM make sense in this analogy?

 

Setting up a personal site on local hardware has been on my bucket list for a long time. I finally bit the bullet and got a basic website running with Apache on an Ubuntu-based Linux distro. I bought a domain name, linked it up to my IP, got SSL via Let's Encrypt for HTTPS, and added some header rules until Security Headers and Mozilla Observatory gave it a perfect score.

Am I basically in the clear? What more do I need to do to protect my site and local network? I'm so scared of hackers and shit I do not want to be an easy target.

I would like to make a page about the hardware it's running on, since I intend to have it run entirely off solar power like solar.lowtechmagazine and wanted to share technical specifics. But I heard somewhere that revealing the internal state of your server is a bad idea, since it can make exploits easier to find. Am I being stupid for wanting to share details like the computer model and the software running it?

 


 

So it's been almost 10 years since I've swapped computer parts, and I am nervous about this. I've never done any homelab-type thing involving big powerful parts, just dealt with average mid-range consumer-class parts in standard desktop cases.

I do computational work now and want to convert a desktop PC into a headless server with a beefy GPU. I bit the bullet and ordered a used Tesla P100 16GB. Based on what I'm reading, a new PSU may be in order as well, if nothing else. I haven't actually read the labels yet, but online info on the desktop model indicates it's probably around a 450-watt PSU.

The P100's power draw is rated at 250 W maximum. The card I'm using now draws 185 W maximum. I'm reading that 600 W would be better for just-in-case overhead. I plan to get this 700 W one, which I hope is enough overhead to cover an extra GPU if I want to take advantage of NVIDIA CUDA with the 1070 Ti in my other desktop.

How much does the rest of the system use on average, with a Ryzen 5 2600 six-core in an AM4 motherboard and about 16 GB of DDR4 RAM?
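For my own sanity check I sketched the budget like this; the non-GPU draws are rough assumptions, not measurements:

```python
# Back-of-the-envelope PSU headroom estimate (assumed figures, not measured)
cpu_w  = 65    # Ryzen 5 2600 rated TDP; boost can spike higher
gpu_w  = 250   # Tesla P100 rated maximum
rest_w = 75    # guess: motherboard + 16 GB DDR4 + fans + drives

peak_w = cpu_w + gpu_w + rest_w
psu_w = 700

print(f"Estimated peak draw: {peak_w} W")                   # ~390 W
print(f"Headroom on a {psu_w} W PSU: {psu_w - peak_w} W")   # ~310 W
```

If those guesses are anywhere near right, 700 W leaves room for a second GPU, but I'd love confirmation.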

When I read up on powering the P100, though, I stumbled across this Reddit post of someone confused about how to get it to connect to a regular consumer Corsair PSU. Apparently the P100 uses a CPU (EPS 8-pin) power cable instead of a PCIe one? But you can't use the regular CPU power output from the PSU. According to the post, people buy adapter cables with two GPU input cables to one CPU output cable for these cards.

Can you please help me with a sanity check, and with understanding what I've gotten myself into? I don't exactly understand what I'm supposed to do with those adapter cables. Do modern PSUs come with multiple GPU power outputs these days, so I need to run two parallel lines into that adapter?

Thank you all for your help on the last post; I'm deeply grateful for all the input I've gotten here. I'll do my best not to spam-post my tech concerns, but this one has me really worried.


 

Do I need to worry about upgrading the motherboard along with the GPU if it's old, or will it work okay just buying a new GPU?

 

I have a memory foam mattress on top of a cot. Every now and then I need to sun-dry the mattress and cot because a decent amount of moisture gets trapped between the two. Is there a way to keep the moisture out, or at least reduce it?
