• FQQD@lemmy.ohaa.xyz · 51 points · 6 months ago

    People don’t use FileZilla for server management anymore? I feel like I’ve missed that memo.

    • RonSijm@programming.dev · 16 points · 6 months ago

      I suppose in the days of ‘Cloud Hosting’ a lot of people (hopefully) don’t just randomly upload new files (manually) onto a server anymore.

      Even if you still use plain servers that behave like this, a better practice would be to have a build server that creates builds - for example, whenever you check code into the main branch, it creates a deployable artifact, and you deploy from there - instead of compiling locally, opening FileZilla and uploading it.

      If you’re using ‘Cloud Hosting’ - AWS, for example - and you’re on VMs or bare metal, you’d maybe upload a new Elastic Beanstalk application version or a new Machine Image, and deploy that in a more managed way. Or if you’re using Docker, you just push a new Docker image to a Docker registry and deploy that.
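      The “build server deploys on check-in” flow above can be sketched as a GitHub Actions workflow. This is a hedged example - the registry URL, image name and secret name are placeholders, not anything from this thread:

      ```yaml
      # .github/workflows/deploy.yml - hypothetical CI deploy on push to main
      name: build-and-push
      on:
        push:
          branches: [main]
      jobs:
        build:
          runs-on: ubuntu-latest
          steps:
            - uses: actions/checkout@v4
            - name: Build Docker image
              run: docker build -t registry.example.com/myapp:${{ github.sha }} .
            - name: Push to registry
              run: |
                echo "${{ secrets.REGISTRY_TOKEN }}" | docker login registry.example.com -u ci --password-stdin
                docker push registry.example.com/myapp:${{ github.sha }}
      ```

      From there, “deploying” means telling the host to pull the new tag, rather than copying files by hand.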

      • dan@upvote.au · 4 points · 6 months ago

        For some of my sites, I still build on my PC and rsync the build directory across. I’ve been meaning to set up GitLab or something similar and configure automated deployments.
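        That rsync deploy is usually just two commands; here’s a minimal sketch, assuming an npm build and a made-up host/path (the runnable part below syncs two local directories instead, purely to show the flags in action):

        ```shell
        # Real-world shape of the deploy (host and paths are placeholders):
        #   npm run build
        #   rsync -avz --delete dist/ deploy@example.com:/var/www/site/

        # Same flags demonstrated locally, so it runs anywhere rsync exists:
        mkdir -p /tmp/demo-src /tmp/demo-dest
        echo '<h1>hello</h1>' > /tmp/demo-src/index.html
        rsync -a --delete /tmp/demo-src/ /tmp/demo-dest/
        cat /tmp/demo-dest/index.html
        ```

        `--delete` makes the destination mirror the source, so files removed from the build don’t linger on the server.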

        • amazing_stories@lemmy.world · 2 points · 6 months ago

          This is what I do because my sites aren’t complicated enough to warrant a build system. Personally I think most websites out there are over-engineered. Example: a Discord friend made a React site that displays stats from a gaming server. It looks nice, but you literally can’t hyperlink to any of the data; it can only be loaded dynamically, and it only looks coherent on a phone in portrait mode. There are a lot of people following trends (some of them good trends) without really thinking about why.

          • dan@upvote.au · 6 points · edited · 6 months ago

            I’m starting to like the htmx model a lot. Server-rendered app that uses HTML attributes to configure the dynamic bits (e.g. which URL to hit and which DOM element to insert the response into). Don’t have to write much JS (or any in some cases).
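            A minimal taste of that model (the `/stats` endpoint is made up; `hx-get`, `hx-target` and `hx-swap` are the real htmx attributes):

            ```html
            <!-- htmx sketch: clicking the button fetches /stats and swaps the
                 returned server-rendered HTML into the #stats div -->
            <script src="https://unpkg.com/htmx.org@1.9.12"></script>
            <button hx-get="/stats" hx-target="#stats" hx-swap="innerHTML">
              Refresh stats
            </button>
            <div id="stats"></div>
            ```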

            > you literally can’t hyperlink to any of the data

            I thought most React-powered frameworks use a URL router out of the box these days? The developer does need to have a rough idea of what they’re doing, though.

        • RonSijm@programming.dev · 1 point · 6 months ago

          Yeah, I wasn’t saying it’s always bad in every scenario - but we used to have this kind of deployment at a professional company. It’s pretty bad if this is still how you’re doing it in an enterprise scenario.

          But for a personal project, it’s alright-ish. Still, there are easier setups - for example, configuring an automated deployment from GitHub/GitLab. You can check out other people’s deployment configs, since all that stuff is part of the repo, in the .github folder. So probably all you have to do is find a project that’s similar to yours, like “static file upload to an SFTP server”, and copy-paste the script into your own repo.

          (for example: a script that publishes a website to github pages)
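          As a sketch, the stock GitHub Pages workflow is only a few lines (these are GitHub’s official Pages actions; the `./public` path is a placeholder for wherever your build output lands):

          ```yaml
          # .github/workflows/pages.yml - publish a static site to GitHub Pages
          name: publish-site
          on:
            push:
              branches: [main]
          permissions:
            pages: write
            id-token: write
          jobs:
            deploy:
              runs-on: ubuntu-latest
              environment:
                name: github-pages
              steps:
                - uses: actions/checkout@v4
                - uses: actions/upload-pages-artifact@v3
                  with:
                    path: ./public   # folder containing the static files
                - uses: actions/deploy-pages@v4
          ```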

    • ResoluteCatnap@lemmy.ml · 3 points · edited · 6 months ago

      They have bundled malware in the main downloads on their own site multiple times over the years, and even denied it, trying to gaslight people into thinking the AVs were giving false positives because AV companies are paid off by other corporations. The admin will even try to delete the threads about this stuff, but the web archive comes to the rescue…

      https://web.archive.org/web/20180623190412/https://forum.filezilla-project.org/viewtopic.php?t=48441#p161487

      • FQQD@lemmy.ohaa.xyz · 3 points · edited · 6 months ago

        You know what? I didn’t believe you, since I’ve been using it on Linux for a long time and never had any issues with it. Today, when I helped a friend (on Windows) with an SFTP transfer and recommended FileZilla, was the first time I realised the official downloads page serves adware. The executable even gets flagged by Microsoft Defender and VirusTotal. That’s actually REALLY bad. Isn’t FileZilla operated by Mozilla? Should I stop using it, even though the Linux versions don’t have the sketchy stuff? It definitely leaves a really bad taste.

        • ResoluteCatnap@lemmy.ml · 3 points · edited · 6 months ago

          Yeah, it’s bad. I’m surprised they’re still serving that crap in their own bundle, but I guess some things don’t change.

          FileZilla has no relation to Mozilla. But yeah, I moved away from it years ago. The general recommendation I’ve seen is “anything but FileZilla”. Personally I use WinSCP on Windows, and I’ll have to figure out what to use when I switch my daily driver to Linux.