
Posts: 13 · Comments: 306 · Joined: 6 mo. ago

  • I just read through your article in detail rather than skimming, and what you outline perfectly mirrors my current sentiments. I just posted a related question: why build ActivityPub instead of feature-rich clients built on top of existing protocols like RSS/Atom/email, which are already built around self-hosting?

    There is something about the server/instance intermediation of ActivityPub that strikes me as antithetical to the goals of distributed social networking, but I do understand the advantages of offloading moderation and server admin to specialists. Instance as nexus for message routing is an interesting way to look at it; I have been thinking of it as Instance as Aggregator, essentially providing group moderation and directory services but offloading all content distribution back to the host/client format of the open web/blogosphere. It would solve the authorship issue, among other problems, if syndication were actually just a linkback to the original host.

  • LOL, what? My brother, this is like using a microwave and complaining it doesn't brown your pie. Don't blame the pie, you picked the tool.

    As I have had crammed into my understanding a few times: all ActivityPub posts are in fact the same and completely interoperable; it is only the Instance and Client that determine how you can interact with that content.

    If you use your instance's web client as the only way to interact with ActivityPub content, you are choosing to defer the decisions about how you interact with that content to your Instance admin. It's not a Fediverse problem, it's a you problem.

    Also keep in mind that many clients, including Interstellar and FediLab, allow you to log in to multiple accounts on different instances and switch between them easily.

    If you want more features, use a broader-based client, and subscribe to the users/communities on the instance you like to keep continuity of content.

  • This does remind me that I wish Fediverse clients would have RSS reader functionality built in by default. I have a sneaking suspicion some do and I just don't know how to use the feature. Effectively allowing people to "boost" (i.e., repost with a backlink) RSS updates in a Fediverse client would enable most of what a blogger would want from the Fediverse, with the exception of receiving all the comments on the posts they share.
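    To make the "boost with backlink" idea concrete, here is a rough Python sketch (stdlib only; the feed content and actor URL are made up) of wrapping an RSS item in an ActivityStreams 2.0 Announce whose object just points back to the original host:

```python
import json
import xml.etree.ElementTree as ET

# A minimal RSS item, inlined so the example is self-contained.
RSS = """<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item>
    <title>Hello Fediverse</title>
    <link>https://blog.example/hello</link>
    <description>First post.</description>
  </item>
</channel></rss>"""

def rss_item_to_announce(rss_xml, actor):
    """Wrap the first RSS item in an ActivityStreams 2.0 Announce
    whose object is just a link back to the original host."""
    item = ET.fromstring(rss_xml).find("./channel/item")
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Announce",
        "actor": actor,
        "object": {
            "type": "Page",
            "name": item.findtext("title"),
            "url": item.findtext("link"),  # the backlink: no re-hosting
            "summary": item.findtext("description"),
        },
    }

activity = rss_item_to_announce(RSS, "https://social.example/users/alice")
print(json.dumps(activity, indent=2))
```

    The key point is the `url` field: the client announces the post but never re-hosts the content, so syndication stays a linkback to the original host.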

    Bridgy does that, but then it is essentially just a mirror, so it has the server inefficiency of redundant hosting built in.

    That, you might say, is the fundamental design decision of ActivityPub: shifting the hosting burden from a single host to a distributed network of server instances. This enables a more robust network, with instances holding content their users have interacted with regardless of whether the original host instance goes down. It also reduces load times for content after it has been federated to a user's local instance, assuming that instance is closer in proximity and capable enough. At the same time, this makes content ownership and control a challenge.

    Functionally the Fediverse is a public commons with content ownership practically distributed across the network of instances, whether copyright says so or not. Attempts to impose universal author controls on this framework face a lot of dissonance because it is fundamentally at odds with the underlying concept of federation as distributed hosting. The minute a host begins hosting content over which they have no control (such as encrypted posts) the potential for abuse skyrockets.

    Since the popularization of the Distributed Social Network concept I have wondered whether pre-existing content distribution infrastructure like RSS might not be more advantageous as a backbone for social networking, with the development load entirely shifted to the client side and away from protocols. The IndieWeb project is playing with some of these ideas, and I have seen some prototypes online of RSS based social networks, so my question is, what is the fundamental advantage of ActivityPub over the combination of these other existing protocols with longer histories and broader existing implementation? RSS, email, XMPP, etc. Is lower latency really a good enough justification for widely redundant data distribution?

    This question becomes increasingly relevant when it comes to multimedia: the minute you offload multimedia to central servers by link embedding instead of hosting within the instance, boom, you are back to the old centralized architecture, and why are you federating?

    So I am going to pose this question to the Fediverse myself: why should federated content distribution be adopted for general use rather than distributed aggregation? That is to say, if a client provided the same features as a Fediverse front end, but all of the content was self-hosted and listed via RSS or Atom, with comments handled via Webmention, direct messages via email or XMPP, and moderation handled at the level of aggregation via instances (meaning a user "joins" or "subscribes" to an instance, and that instance provides a ban list, a list of feeds subscribed to by its users for discovery, and a user directory), what features would this type of system lack that ActivityPub-based systems have in place?
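    For the comments-via-Webmention piece, the W3C spec boils down to a single form-encoded POST. A minimal sketch of building that request (the URLs are hypothetical, and the actual endpoint discovery and network send are omitted):

```python
from urllib.parse import urlencode

def build_webmention(source, target):
    """Build the form-encoded body a Webmention sender POSTs to the
    target's advertised webmention endpoint (per the W3C Webmention spec)."""
    body = urlencode({"source": source, "target": target})
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    return body, headers

body, headers = build_webmention(
    "https://blog.example/my-reply",       # the post containing the comment
    "https://other-blog.example/article",  # the post being commented on
)
print(body)
```

    The receiving host verifies that the source actually links to the target, which is what makes the comment system spam-resistant without any shared infrastructure.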

    There are three advantages I see, and I'm not completely sure they justify mass adoption versus the cost of broad content redundancy and authorship issues:

    1. Choosing a local instance for faster loading, but this is only an advantage after content has been brought in for the first time; the first load is actually slower, as the instance has to pull the content and then serve it to the user.
    2. "All" content in the protocol is of the same type, allowing for easier interoperability between clients and services. I'm thinking this is the root of what most people will say is the big advantage of ActivityPub vs. older protocols, but I'd like to hear more about why this is enough of a reason to overcome the inertia of existing mass adoption and support of the alternatives.
    3. It isn't based on XML, and modern devs don't want to use XML. As I'm not a coder, I can't say how big an influence this has, but from what I have seen it seems to be a substantial factor. Can anyone explain why?
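    On point 3, a small illustration of the ergonomic difference (the same post expressed both ways; URLs are made up): the Atom entry needs namespace-aware tree navigation, while the ActivityStreams 2.0 object parses straight into native data structures:

```python
import json
import xml.etree.ElementTree as ET

# The same post as an Atom entry (XML)...
atom = """<entry xmlns="http://www.w3.org/2005/Atom">
  <title>Hello</title>
  <id>https://blog.example/1</id>
  <content type="text">Hi fediverse</content>
</entry>"""

# ...and as an ActivityStreams 2.0 object (JSON).
as2 = json.loads("""{
  "@context": "https://www.w3.org/ns/activitystreams",
  "type": "Note",
  "id": "https://blog.example/1",
  "content": "Hi fediverse"
}""")

# JSON lands directly in dicts/lists; XML needs namespace-qualified
# lookups, which is a big part of the ergonomic complaint.
ns = {"a": "http://www.w3.org/2005/Atom"}
entry = ET.fromstring(atom)
assert entry.find("a:id", ns).text == as2["id"]
print(as2["content"])
```

    Neither format can express anything the other can't; the difference devs talk about is almost entirely in how much ceremony it takes to get at the data.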
  • This is stalking. She could get a restraining order which includes interacting with her online content, so if he continued to repost it would be a violation of the order.

  • Yes, but why not disclose that you used AI in writing the article? Failing to follow journal guidelines is the fault here, not using AI for productive purposes.

  • It's wild to me that Soulseek persists despite being entirely mediated by a single central server. I would have thought it would have gotten the takedown long ago.

    Also, the fact that it doesn't swarm and only does one-to-one peer sharing is kind of odd to me, but I guess it actually makes some sense in that it constrains the network to being more optimal for smaller files like music, and so keeps video off the platform for the most part.

    Worth noting that the Soulseek Wikipedia page lists a bunch of clients you can use, including Seeker for Android and others for all platforms, including Linux.

  • Something to do with 9/11 and a diagram of Saddam Hussein's hiding place that the US military released after he was captured.

  • For ISP blocks like this, will proxies or a VPN work?

    Also, do the sites mirror to the darknet?

  • It was always so smooth.

  • XMPP Lives!

  • Very interesting, so just with a tweak to the client you could treat communities basically as (hash)tags instead of forums? I suppose what I'm thinking of amounts to unique tag identifiers that are computer-identifiable based on subject matter/content. I know that this is effectively what is going on under the hood of the social graph at the large social media sites, but rather than connecting the content together into transparent collections, they instead serve it to individuals through the suggestion engine as part of feeds.

    Something both "spontaneous" and somewhat transparent, at least in the grouping/collection, is what I think would differentiate the feature, but how to defend it from manipulation is a big question. How do you protect algorithm/AI-guided curation from AI-guided manipulation seeking to maximize placement of content in as many groups/collections as possible? Even a reputation system could just be used to reinforce more advanced content placement techniques.

    I guess there is always the big shrug: if it is relevant, it is relevant.

  • As far as anxiety goes, video/cinema started as, and still is, primarily a team art/craft. Tap into your local media production scene and get advice from those already in the trade. Apprenticeship is the classic model, but you don't have to commit to anything. Most people will gladly take an extra hand as a grip or production assistant, or just chat over coffee about how they do what they do.

    Trade in some social anxiety for anxiety about the work; it will do you good in the long run, because before you know it you'll need or want something you don't have, be it gear or some experienced labor, and having someone to call about it is worth more than a hundred tutorials.

  • Travel is a tough first subject to launch on. If you're serious about giving it a shot, I highly suggest you start by treating your local area as your first subject. Treat yourself like a tourist in your area and script it as though it is your first time visiting.

    Work on your format: putting together shot lists, deciding how much you want to script ahead vs. retroactively, what your target video length is, and how you are going to handle shorts.

    Luckily there are a lot of solid how-to video courses on YouTube and elsewhere, so the first part of the job will be self-education and applying what you are learning directly to your desired subject matter.

    Definitely do not spend your budget on the travel and then just wing it. If you're serious about making a channel, give yourself the flexibility of being able to go and reshoot things locally while you learn the tools.

  • The difference is that I'm talking about the automation creating completely new groupings, most akin to a community on Lemmy, coordinated across multiple users, in my mind "simultaneously", with the user still agreeing to opt in to inclusion in that group.

    There is an alternative way to do this, which would be for the automation to group the posts after posting; however, there is a question there about opt-in: will users want to opt existing posts in after the fact?

    One way that would definitely be easiest to implement would be if these groupings were essentially threads, with a single piece of content as the "start"/"seed" of the thread and the other posts relating to that thread. Regarding opt-in for that, I suppose it could be as easy as enabling/disabling "thread seeding".
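    A rough sketch of how that opt-in could look as a data model (all names here are hypothetical; a real implementation would live in the client or instance software):

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    content: str
    seed_threads: bool = True   # the per-post opt-in toggle discussed above

@dataclass
class SpontaneousThread:
    seed: Post                  # the single piece of content that starts the grouping
    replies: list = field(default_factory=list)

    def attach(self, post):
        if post.seed_threads:   # only opted-in posts join the thread
            self.replies.append(post)

seed = Post("alice", "Anyone restoring vintage synths?")
thread = SpontaneousThread(seed)
thread.attach(Post("bob", "Yes! Juno-106 voice chips."))
thread.attach(Post("carol", "lurking", seed_threads=False))
print(len(thread.replies))  # carol opted out, so only bob is attached
```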

  • Yes, the automation datastream would need to be segregated, and probably ephemeral.

    The point is that if you want posts to spontaneously coalesce with some kind of shared metadata, you want the ML content-analysis information for the post to go out before the actual post is published, so the final post metadata can include the "group" tag or whatever you want to call it.

    Alternately, you could do it after the fact by editing the post, but it seems like there would probably be some degree of a chicken-and-egg scenario there.

    All of this could be done by the client completely independently of post metadata, of course, but then how do you make the relation of the posts to each other consistent between multiple users? Whether that is even a desirable/necessary goal is a question, I suppose.

  • Regarding "where content should go", I mean things like which community to post a Lemmy post in, or which account to post to if a person manages multiple topical accounts (or accounts on different instances specialized for specific services), or whether to format it for Loops vs. PeerTube for video, etc.
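    As a toy illustration of that venue-suggestion step, here is a sketch using a plain keyword lookup table (all venue names are made up; a real client would presumably use an ML classifier, but the shape of the decision is the same):

```python
# Hypothetical routing table: keyword -> venue. Stands in for whatever
# smarter classifier a real publishing client would use.
VENUES = {
    "video": "peertube@video.example",
    "short": "loops@loops.example",
    "linux": "!linux@lemmy.example",
    "synth": "!synthdiy@lemmy.example",
}

def suggest_venues(text):
    """Return every venue whose keyword appears in the draft post."""
    words = set(text.lower().split())
    return [venue for kw, venue in VENUES.items() if kw in words]

print(suggest_venues("my new linux synth build video"))
```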

    Gossip-wise, I'm imagining compressed data posted in a format only intended to be read by the automation systems, exchanged before the suggestions are made, so that the suggestions can include dynamically grouping content before publication by appending the relevant metadata/format (like posting in a Lemmy community).
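    A minimal sketch of what such a machine-only gossip payload could look like, assuming plain zlib-compressed JSON (the field names are invented for illustration):

```python
import base64
import json
import zlib

def pack_gossip(metadata):
    """Compress analysis metadata into a compact, machine-only blob
    that instances could exchange ahead of publication."""
    raw = json.dumps(metadata, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(zlib.compress(raw)).decode()

def unpack_gossip(blob):
    """Reverse of pack_gossip: recover the metadata dict."""
    return json.loads(zlib.decompress(base64.urlsafe_b64decode(blob)))

# Hypothetical pre-publication analysis result for one draft post.
msg = {"topic_vector": [0.12, 0.87, 0.05], "suggested_group": "synth-repair"}
blob = pack_gossip(msg)
print(len(blob), "chars on the wire")
```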

  • Aha, yes, I am guilty: I only read the first half of the article. That makes a lot of sense.

  • Don't let anyone tell you a YouTube channel will cost $5000 to start. All the basic software tools can be had for free: the DaVinci Resolve suite, OBS, etc. Hardware-wise, you can start with a decent phone camera and a $100-$200 lapel mic like the Lark and be completely fine; add in a $250 budget for lighting and you'll be ahead of the game.

    Video is 40% ideas/concepts, 3% production, 7% editing, and 50% persistence. Seriously, the main thing is putting in the hours and not stopping. Your stuff will be crap when you start, but as long as you keep putting out content it will get better and you will grow your channel. 10 hr/week, you can do it.

  • I'm going to have to dive into push notification handling. I basically minimize the use of push notifications at the desktop level to only push work-related content, and use the notification systems of the clients to handle the "recreational" content, which leaves me checking lots of platforms separately.

    It makes sense that there should be the ability to create separate profiles with different filters and behaviors at the push notification manager level, I just haven't thought to look into it before.

    Regarding killer apps for ActivityPub and unified clients, I have a second idea, which I didn't want to cloud this thread with, that seems somewhat inevitable and would require a central portal with access to all services (and accounts?). That is a single publishing UI where the user creates/uploads any piece of content, and it then suggests what venue/service/account to publish it on, along with related add-ons like hashtags, etc. With the Fediverse, the APIs are open and multiplatform publishing clients (like FediPlan) already exist, so a level of light ML/AI for publication seems inevitable.

    The next level of this, and what could be a "killer app", is spontaneously generated affinity grouping via content-aware publishing, meaning that the publishing client not only suggests where the posts should go, but also has a metalayer where the publishing clients' instances "gossip" about the content being published and then create brand-new "spontaneous" venues in which to publish that content alongside similar content being published by other users. Suddenly your text post about a super-niche interest or problem is pooled with posts by other users on the same topic, and bam, you have a relevant discussion group of commenters/posters.
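    A toy sketch of that "spontaneous venue" idea, using a crude token-overlap similarity in place of real ML (the post texts are made up; a real system would compare embeddings from the gossip layer):

```python
def jaccard(a, b):
    """Crude similarity between two posts: overlap of their token sets."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

def group_posts(posts, threshold=0.3):
    """Greedy grouping: each post joins the first existing group whose
    seed post it resembles, otherwise it seeds a new 'spontaneous venue'."""
    groups = []
    for post in posts:
        for group in groups:
            if jaccard(group[0], post) >= threshold:
                group.append(post)
                break
        else:
            groups.append([post])
    return groups

posts = [
    "repairing a juno 106 voice chip",
    "juno 106 voice chip repair tips",
    "best sourdough starter flour",
]
groups = group_posts(posts)
print(len(groups))  # the two synth posts pool together; sourdough seeds its own venue
```

    The abuse problem mentioned above shows up directly here: anyone who can game the similarity function can force their content into every group, which is why the defence question matters.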

    Problems of course arise from this re: advertisers/promoters, as well as unsavory/harmful mutual interests, but to be honest I think this is more of an inevitability than a possibility, so getting ahead of it to architect it in a way that minimizes potential abuse before the corpos get on it is probably a good idea.

  • Basically, Burkina Faso went out on a limb allowing them to launch the program, either not fully realizing that the mosquitoes would eventually cross borders and make this an international incident, or just deciding they didn't care and would do it anyway.

    Burkina Faso is in the process of merging with several other countries to make a federated union in the Sahel; most likely one of the other nations in the Pact found out about it and objected, so they are coming down hard to save face.

    Essentially there is enough skepticism of the technology among the African populace that this is now becoming a third rail issue for politicians. Those countries independent enough to not care what their neighbors think have populations against the technology, and everyone else is internationally minded enough to defer to the objectors.

    That said, pesticide or no, the cat may well be out of the bag at this point; if even a few of the test-release mosquitoes survive, it will only be a matter of time before the genes spread through the population. That's the whole point of the gene drive.