• 0 Posts
  • 378 Comments
Joined 2 years ago
Cake day: September 15th, 2022

  • Ah yes, because the blue party is 5.2 Zionist (they think apartheid and ethnic cleansing are the birthright of whites on all days but Sundays), while the green party is 6.8 Zionist (they think apartheid and ethnic cleansing are the birthright of whites, Sundays included).

    The blue party actually wanted to dismantle Israel you guys, they were just waiting for the mortgage to be paid off.





  • If I share an IP with 100 million other Signal users

    That’s already not very likely, but even ignoring IP, you’re the only one with your SSL keys. As part of authentication, you are identified, and all the information about your device is transmitted. You can stop identifying yourself in later messages, but your SSL keys still tie your messages together. That identifying material is discarded once the server decrypts the message, so in theory your messages are anonymised if they leak to a third party. That seems to be what sealed sender is designed for, but it isn’t what I’m concerned about.

    daniel sent a user an image…

    Right, but it’s not other users I’m scared of. Signal also has my exit node.

    What you’re describing is (not) alarming (…) Signal’s security team wrote.

    I mean, if strangers can find my city on the secret chat app, I find that quite alarming. The example isn’t that coarse, and Signal, being a centralised platform with strictly locked-down access, could well defend users against this.

    What do you mean when you say “conversation” here?

    When their keys are refreshed. I don’t know how often. I meant a conversation as people understand it, not first time contact. My quick internet search says that the maximum age for profile keys is 30 days, but I would imagine in practice it’s more often.

    Even if we trust Signal, with Sealed Sender, without any sort of random delay in message delivery, a nation-state level adversary could observe inbound and outbound network activity and derive high confidence information about who’s contacting whom.

    That is true, but it’s no reason to cut Signal slack. If either party is in another country or on a VPN, that’s a mitigating factor against whole-network monitoring. But if Signal shares its data with that adversary, then the VPN or the different country no longer helps.

    Here’s the blog post from 2017

    I appreciate the blog post and information. I don’t trust them to only run the published server code. It’s too juicy of an honeypot.

    I don’t have any comment on SGX here. It’s one of those things where there are so many moving parts, so much secret information, and so much you have to understand and trust that it becomes practically impossible to verify, or even to trust someone who claims to have verified it. Sometimes that’s an inappropriate position, but I think it’s fine here: Signal doesn’t offer me anything, so I have no reason to put that much effort into understanding what can be verified with SGX.

    And thanks for the audits archive.
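The traffic-correlation point above (a network observer linking inbound and outbound activity) can be illustrated with a toy sketch. All names, timestamps, and the delay window here are invented for illustration; this is the general attack shape, not a claim about Signal's actual timing behaviour:

```python
# Toy traffic-correlation sketch: the observer sees only
# (identity, timestamp) events at the server's network edge, yet can
# link senders to recipients by matching each upload to the download
# that follows it almost immediately. All events are invented.

inbound = [("alice", 10.000), ("carol", 12.500)]   # uploads observed
outbound = [("bob", 10.004), ("dave", 12.503)]     # downloads observed

MAX_DELAY = 0.050  # without artificial delay, forwarding takes milliseconds

def correlate(inbound, outbound, max_delay=MAX_DELAY):
    """Pair each upload with the first download inside the delay window."""
    pairs = []
    for sender, t_in in inbound:
        for receiver, t_out in outbound:
            if 0 <= t_out - t_in <= max_delay:
                pairs.append((sender, receiver))
                break
    return pairs

print(correlate(inbound, outbound))
# -> [('alice', 'bob'), ('carol', 'dave')]
```

Adding random delivery delay widens the window and forces the observer to accept many false pairings, which is why the lack of such delay matters.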






  • Running R1 locally isn’t realistic, but you can rent a server and run it privately on someone else’s computer. That costs about 10 per hour; running it on CPU is a little less, and you need about 2TB of RAM.

    If you want to run it at home, even quantised to 4-bit, you need about 20 4090s. And since normal desktop mainboards take at most 4 GPUs, that’s 5 whole computers, plus the networking between them. A more realistic setup is running it on CPU with some layers offloaded to 4 GPUs: 4 4090s and 512GB of system RAM. Absolutely not cheap or what most people have, but technically still at the very top end of what a home computer can be. And remember, this is still the dumbed-down 4-bit configuration.

    Edit: I double-checked, and 512GB of RAM is unrealistic; in fact anything above 192GB is. High-end AM5 mainboards support up to 256GB, but 64GB RAM sticks are much more expensive than 48GB ones, so most people will opt for 48GB sticks or smaller (4 × 48GB = 192GB). To use 512GB you need a Threadripper, which is very unlikely in a home computer, though it might make sense if the machine doubles for professional work; such a board might also have 8 RAM slots, and such a person might think it reasonable to spend 3000 Euro on RAM. So: if you spent 15K Euro on your home computer, you might be able to run a reduced version of R1, very slowly.
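The memory figures above can be sanity-checked with rough arithmetic. The full R1 model is about 671B parameters; the 20% overhead factor for KV cache and activations is an assumed ballpark, not a measured number:

```python
import math

# Rough sanity check of the VRAM figures quoted above.
# 671B parameters is the published size of the full R1 model;
# the 1.2x overhead for KV cache/activations is an assumption.
params = 671e9
bytes_per_param_4bit = 0.5          # 4-bit quantisation = half a byte
overhead = 1.2                      # assumed headroom, not measured

total_gb = params * bytes_per_param_4bit * overhead / 1e9
gpus_needed = math.ceil(total_gb / 24)   # an RTX 4090 has 24GB of VRAM

print(f"{total_gb:.0f} GB -> {gpus_needed} x 4090")
# -> 403 GB -> 17 x 4090
```

Seventeen cards at zero waste; with real-world fragmentation and per-card headroom, "about 20" is the right order of magnitude.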


  • Okay. But this method doesn’t address the fact that the service doesn’t need the message to include the sender in order to know who the sender is. The sender (or at least their unique device) can be appended to the message by the server, with 100% accuracy, after it’s received. Even if we trust them on the parts that require trust, the setup described in the blog does nothing to prevent social graphs from being derived, since the sender is identified at the start of every conversation.

    If we trust them not to store any logs (unverifiable), then this method means they can’t know precisely how long a conversation lasted or how many messages were exchanged. But they can still know precisely when, and how many, messages each participant received; there’s just a chance those participants are talking to multiple people. Then again, if we’re trusting them not to store logs (unverifiable), there shouldn’t be any data to cross-reference in the first place. So if we can’t trust them, why are we trusting them not to take note of the sender?

    The upside is that if a message leaks to a third party, it now contains less information. I’m ignoring the GitHub link, not because I don’t appreciate you finding it, but because I take the blog post to be the mission statement for the code, and the blog doesn’t promise a system that comprehensively hides the sender’s identity. I trust their code to do what is described.
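The core point above, that a server can re-attach the sender itself even when the envelope omits it, can be sketched in a few lines. Everything here is invented for illustration (it is not Signal's actual server code); it just shows what a compelled or dishonest server could trivially log:

```python
import time

# Sketch of the argument above: a "sealed" envelope omits the sender,
# but the server still knows which authenticated connection delivered
# it, and can stamp that identity onto the message on receipt.
# All names and fields are hypothetical.

def handle_sealed_message(envelope: dict, connection_id: str) -> dict:
    """What the server could log despite the envelope hiding the sender."""
    return {
        "recipient": envelope["recipient"],   # needed for delivery anyway
        "size": len(envelope["ciphertext"]),
        "received_at": time.time(),
        "sender_connection": connection_id,   # re-attached by the server
    }

log = handle_sealed_message(
    {"recipient": "bob", "ciphertext": b"..."},
    connection_id="device-1234",
)
print(log["sender_connection"])  # the "anonymous" sender, recovered
```

The envelope format only constrains what honest code stores; it cannot stop the server from recording the connection it arrived on.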


  • I think Dessalines’ most recent comment is fair, even if it’s harsh. You need to understand the nature of a “national security letter” to have the context. The vast majority of (USA) government requests are NSLs because they require the least red tape. When you receive one, it’s illegal both to disclose that you have and to refuse to comply. It requires you to share all the metadata you have, and they routinely ask for more.

    Here’s an article that details the CIA connection https://www.kitklarenberg.com/p/signal-facing-collapse-after-cia

    The concern doesn’t stem from the CIA funding. It’s inherent to all services operating in or hosted in the USA: they should be assumed compromised by default, since the laws of that country require them to be. Therefore, any app you trust has to be completely unable to spy on you. Signal understands this and uses it in their marketing. But it isn’t true: they’ve made decisions that allow them to spy on you, and they ask that you trust them not to. Matrix, XMPP and SimpleX cannot spy on you by design. (It’s possible those apps were built wrong and therefore allow spying, but that’s a different argument.)




  • Your client talks to their server, and their server talks to your friend’s client. They don’t accept third-party apps. The server code is open source, not a secret, but that doesn’t mean what they actually run isn’t 99% the open-source code with a few privacy-breaking changes. Or the server software may run exactly as published, which is moot if other software on the same servers intercepts the data.


  • We can’t verify that. They have a vested interest in lying, and are sometimes barred from disclosing government requests. Still, as I suggested in my previous comment, we can use their statements to make informed guesses about what data they can share.

    They can’t share the content of messages or calls: this is believable and assumed. But they don’t mention anything surrounding a message, such as whom they sent it to (and it is they who receive and send the messages), when, how big it was, and so on. They say they don’t have access to your contact book: also very likely true. But that isn’t the same as being unable to provide a social graph, since they know everyone you’ve spoken to, even if they don’t know what you’ve saved about those people on your device. They also say nothing about connection data they might collect that isn’t directly relevant to providing the service, like device info.

    Think about the feasibility of interacting with the feds in the manner they imply. No extra communication to explain that they can’t provide info they don’t have? Even though they feel the need to communicate that to their customers. Of course this isn’t the full extent of the communication, or they’d be in jail. But they’re comfortable spinning narratives, and their whole business depends on how they react to these requests. Do you think it’s likely that their account of how they handled it is half-truths?


  • Being used by a bunch of NATO armies isn’t the same as being promoted or made by them; it just means they trust Element not to share their secrets. And that blog post is without merit. The author discredits Matrix because it supports unencrypted messaging; that’s not a negative, just a useful feature for when it’s appropriate. Whereas Signal’s major drawbacks, requiring your phone number (tied to government ID in many countries) and locking you to their servers, are actual grounds to discredit a platform. Your post is the crossed-arms furry avatar equivalent of “I drew you as the soyjak”. The article has no substance on the cryptographic integrity of Matrix, because there’s nothing to criticise there.




  • Your data is routed through Signal’s servers to establish connections. Signal absolutely can provide social graphs, message frequency, message times, and message sizes. There’s also nothing stopping them from pushing a snooping build to a single user when that user is targeted by the NSA; that user would need to check every update against verified hashes. And on iOS that’s not even an option, since the official iOS build hash already doesn’t match the repo.
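Checking an update against a known-good hash, as suggested above, amounts to something like this generic sketch (the file names in the usage comment are hypothetical; this is not a Signal-specific tool):

```python
import hashlib

def file_sha256(path: str) -> str:
    """Hex SHA-256 digest of a file, read in chunks to handle large APKs."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage sketch: compare the store-delivered binary against one you
# built yourself from the published source (file names hypothetical).
# if file_sha256("app-store.apk") != file_sha256("app-selfbuilt.apk"):
#     print("binaries differ; investigate before trusting the update")
```

This only helps where builds are reproducible and you can obtain the binary to hash, which is the author's point about iOS: without that, per-user targeted builds are undetectable.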