Sorry, but I can’t think of another word for it right now. This is mostly just venting, but if anyone has a better way to do it, I wouldn’t hate to hear it.

I’m trying to set up a home server for all of our family photos. We’re on our way to de-googling, and part of the impetus for the change is that our Google Drive is almost full. We have a few hundred gigs of photos between us. The problem with trying to download your data from Google is that the only reasonable way to do it is through Google Takeout. First you have to order it. Then you have to wait anywhere from a few hours to a day or two for Google to “prepare” the download. Then you have one week before the takeout “expires.” That’s one week to the minute from the time of the initial request.

I don’t have some kind of fancy California internet, just normal home internet, and there is no way to download a 50-gig (or even 2-gig) file in one go; there are always interruptions that force me to restart the download. But if you try to download the files too many times, Google gives you another error and you have to start over and request a new takeout. Google doesn’t let you download the entire archive in one piece either; you have to select each file part individually.
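For what it’s worth, the standard answer to flaky connections is an HTTP Range request: ask the server to send only the bytes you don’t have yet. A big caveat for Takeout specifically: the download links are tied to your logged-in Google session, so a bare script like this will likely need your browser’s cookies passed along, and Google may not honour ranges at all. This is a generic sketch of the resume technique, not a confirmed Takeout recipe; `resume_download` is a hypothetical helper name.

```python
import os
import urllib.error
import urllib.request

def resume_download(url, dest, chunk_size=1 << 20):
    """Download url to dest, resuming from whatever partial file is on disk."""
    start = os.path.getsize(dest) if os.path.exists(dest) else 0
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-"})
    try:
        resp = urllib.request.urlopen(req)
    except urllib.error.HTTPError as e:
        if e.code == 416:  # requested range starts past EOF: already complete
            return dest
        raise
    # 206 means the server honoured the Range header, so append to the
    # partial file. Anything else (usually 200) means it is sending the
    # whole file again, so start over from scratch.
    mode = "ab" if resp.status == 206 else "wb"
    with open(dest, mode) as f:
        while chunk := resp.read(chunk_size):
            f.write(chunk)
    return dest
```

Tools like `curl -C -` or `aria2c` do the same dance for you, with the same session-cookie caveat.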

I can’t tell you how many weeks I’ve spent trying to download all of the files before they expire or Google throws another error.

  • Darohan@lemmy.zip · 4 months ago

    Just gone through this whole process myself. My god, does it suck. Another thing to be aware of with Takeout and Google Photos: the photo metadata isn’t embedded as EXIF like it would be from a normal service; instead each image gets an accompanying JSON sidecar file. I’m using Memories for Nextcloud, and it has a tool that can restore the EXIF metadata from those files, but it’s not exact, and now I have about 1.5k images tagged as being from this year when they’re really from 2018 or before. I’m looking at writing my own tool to restore some of this metadata, but it’s going to be a right pain in the ass.

      • Darohan@lemmy.zip · 4 months ago

        Ooh, might look into that instead, actually. I always love a reason to write myself a little tool, but dealing with Google’s bull makes it much less appealing when existing tools can do it for me.

    • seang96@spgrn.com · 4 months ago

      Tools for this already exist for Takeout archives; it’s how I migrated to Immich.

    • s38b35M5@lemmy.world · 4 months ago

      They also don’t always keep the metadata in the same archive (zip or tar) as the pictures it belongs to, which can throw off imports with tools that process Google Takeout archives directly. It’s a pretty nasty solution, for real.
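The usual workaround for sidecars landing in a different archive is to extract everything first, then build one index mapping each media filename to its JSON sidecar across all the extracted directories, and only then run the import or timestamp-restore pass. A sketch, assuming the common `<media name>.json` sidecar naming (which, again, varies between Takeout versions); `index_sidecars` is a hypothetical helper:

```python
import os

def index_sidecars(roots):
    """Walk every extracted Takeout directory and map each media filename to
    the path of its JSON sidecar, wherever that sidecar ended up."""
    sidecars = {}
    for root in roots:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                if name.endswith(".json"):
                    # sidecar for IMG_1.jpg is typically named IMG_1.jpg.json
                    media_name = name[: -len(".json")]
                    sidecars.setdefault(media_name, os.path.join(dirpath, name))
    return sidecars
```

With the index in hand, each photo can look up its sidecar by basename even when the JSON shipped in a different zip part.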

      I moved about 140GB to ente.io before they had their newer takeout process, but some destinations can let third-party apps (like rclone) do cloud-to-cloud transfers. Not sure which work best, since I couldn’t go that route myself.

      • Darohan@lemmy.zip · 4 months ago

        Ah great, that could be why a bunch of my photos didn’t get metadata. I’ll look into that, thanks for the tip.