Yo yo!
I’ve been working on making my life more private and need some assistance picking suitable replacement options. Please let me know what you think of my list or if there are any opportunities for improvement! Here’s where I’m at …
Apple Maps
- OsmAnd Maps. Seems like a good option, but it’s not ready out of the box. I need to do more tweaking with it.
- Magic Earth. Haven’t tested it yet; seems good. But I’m looking for free options first before I dabble with paid stuff.
AI (ChatGPT)
- Lumo. ChatGPT is really good, but I understand it’s good because they siphon data illegally, so I’m OK “downgrading” when switching AIs. Lumo seems pretty good so far. I can tell it’s not as advanced, but it will do me fine for what I need. Also, I assume once I pay for Lumo Pro it will be more “powerful”.
- Maple AI. Seems dope; I also like the pay model: pay for what you use over “x” amount of inquiries. Does anyone know how knowledgeable/powerful it is?
- LocalAI or Ollama. These two are beyond my knowledge. I don’t understand how I run these on my own server? If you know anything about these, please ELI5.
Google Docs
- OnlyOffice. Seems like it does everything I want.
- CryptPad. Just heard of this today; need to explore more. Seems dope, but it doesn’t have an app? From what I’ve seen, definitely a strong contender.
Photo App (I haven’t looked into any of these yet)
- Proton Drive.
- Ente Photos.
- Immich.
Google Drive
- Proton Drive.
Why this when you can run Ollama on your computer?
I just started messing around with Ollama. I must be doing something wrong because it’s sooooo slow. I still need to troubleshoot it a lil bit before it becomes my main AI.
Buy a better GPU.
I was kinda doing it wrong. When I was downloading the model it was very slow, but now that I’m actually using the model my inquiries are answered at an OK speed (10-30 sec depending on what I ask).
Interesting that you need a stronger GPU instead of a CPU? Can you tell I know next to nothing about tech …
You can run it on a CPU, but you’d need to buy more RAM. I said GPU because GPUs have a way higher core count than CPUs, and they can tap into system RAM if their VRAM fills up.
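For anyone else following along: the rough rule of thumb for whether a model fits in VRAM (or RAM) is just parameter count times bytes per weight. A tiny sketch of that arithmetic; the numbers are approximations for the weights alone and ignore context/overhead:

```python
def approx_model_gb(params_billions: float, bits_per_weight: int) -> float:
    """Rough memory needed just to hold the weights, in GB.

    Ignores KV cache and runtime overhead, so treat it as a floor,
    not an exact figure.
    """
    bytes_per_weight = bits_per_weight / 8
    return params_billions * 1e9 * bytes_per_weight / 1e9

# A 7B model at 4-bit quantization needs roughly 3.5 GB of VRAM/RAM:
print(round(approx_model_gb(7, 4), 1))  # → 3.5

# The same model at full 16-bit precision needs about 14 GB,
# which is why quantized downloads are the default for home rigs.
print(round(approx_model_gb(7, 16), 1))  # → 14.0
```

That’s why a card with 8 GB of VRAM handles a quantized 7B model fine but chokes on bigger ones.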
Gotta be honest, idk what half of the words you just said mean. Core count, VRAM … still have some learning to do.
My plan is to run Ollama on my rig, kinda like a server I guess, and then use my phone to tap into that whenever I need it. From what I’ve researched that seems doable, but it will take some setup.
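That setup is doable: Ollama exposes an HTTP API on port 11434, and starting the server with `OLLAMA_HOST=0.0.0.0` makes it listen on your LAN instead of just localhost. Here’s a minimal sketch of the client side you’d run from (or for) your phone; the IP address and model name are placeholders for your own setup:

```python
import json
import urllib.request

# Hypothetical LAN address of the rig running `ollama serve`.
# Ollama's API listens on port 11434 by default.
OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # one JSON reply instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def ask(model: str, prompt: str) -> str:
    """Send the prompt to the rig and return the model's reply text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Only works once the server is reachable from your phone/laptop:
# print(ask("llama3", "Explain VRAM in one sentence."))
```

There are also phone apps that speak this same API, so you may not need to write anything yourself; the point is just that “tapping into” the rig is one HTTP call.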