• 98 Posts
  • 374 Comments
Joined 1 year ago
Cake day: June 16th, 2023

  • Am I likely to be annoyed about where the fiber comes into the house?

    That one depends on the company installing it. When I got it installed they asked me exactly where I wanted the fiber to terminate and ran it through the house to an outlet under my desk. So let them know and they might put it where you need it.

    As for the router, I recommend buying a mini PC with at least two Ethernet ports and 4 GB of RAM and running OPNsense. It’s great and will give you all the control you need. Or you can repurpose any old PC you have lying around and add Ethernet ports with a PCIe card.

  • I think the main take on this is to learn the lesson that it is not safe to install random software you come across online. Is this lesson new, though?

    I think people often have a vaguely formed assumption that plugins are somehow sandboxed and less dangerous. But that all depends on the software hosting the plugin. There was a recent issue with a KDE theme wiping a user’s files, which brought this to light. We can’t assume plugins or themes are any less dangerous than random executables.

  • It’s hilarious – and also a bit sad – that Tan and his ilk assume that someone must be paying me to write. They apparently cannot imagine any human motivation beyond money. It does not occur to them that a person could simply be inspired to action because they care about things like community, democracy and truth.

    See also: “if people weren’t under threat of unemployment ruining their lives, they wouldn’t be motivated to work.” Many right-wingers seem to have no conception of being motivated to do something because it’s good to do.

  • Neuronal firing is often understood as a fundamentally binary process, because a neuron either fires an action potential or it does not. This is often referred to as the “all-or-none” principle.

    Isn’t this true of standard multi-bit neural networks too? This seems to be what a nonlinear activation function achieves: translating the input values into an all-or-nothing activation.

    The characteristic of a 1-bit model is not that its activations are recorded in a single bit but that its weights are. There are no gradations of connection weights: they are just on or off. As far as I know, that’s different from both standard neural nets and from how the brain works.
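    The distinction can be sketched in a few lines of NumPy. This is a toy illustration (the variable names and the hard-threshold activation are my own choices, not from any particular 1-bit model): in both cases the neuron’s output is all-or-nothing, but only in the second case are the weights themselves reduced to a single bit.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=4)  # a toy input vector

    # Standard net: full-precision weights, all-or-nothing *activation*.
    # The step function makes the output binary, but the weights still
    # carry many gradations of connection strength.
    w_full = rng.normal(size=4)
    act_standard = 1.0 if w_full @ x > 0 else 0.0

    # 1-bit net: the *weights* are quantized to on/off before use,
    # so no gradations of connection strength remain.
    w_1bit = (w_full > 0).astype(float)
    act_1bit = 1.0 if w_1bit @ x > 0 else 0.0

    # Both outputs are binary; only the weight precision differs.
    print(act_standard, act_1bit)
    ```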