XSF Discussion - 2022-09-10


  1. Guus

    Is there someone here who can give the website build a push, so I don't have to wait 24 h to see if my DOAP changes made any difference? If so: pretty please?

  2. Guus

    (if not, I'll fake patience)

  3. emus

    Guus: I think it builds every hour currently

  4. Guus

    ah, that's pretty good. I thought it was 24h. Nevermind then :)

  5. emus

    Guus: if you need a deeper discussion, ping wurstsalat

  6. emus

    cool that you support this approach

  7. Guus

    'support' / 'be pressured into' :)

  8. Guus

    "I want shiny things!"

  9. MattJ

    Guus: ping me if they're not live an hour after you pushed them

  10. Guus

    will do, thanks

  11. emus

    > Guus, 2022-09-10 08:47 (GMT+02:00):
    > 'support' / 'be pressured into' :)
    > "I want shiny things!"
    one can still have a simple, limited file

  12. Guus

    To be clear: I've not pushed changes to the XSF website; I pushed changes to Openfire's DOAP file, which is in a different repository. (Also, these changes aim to fix the rendering issue on the XSF website, but I can't be sure they do. There is a chance that the changes _were_ picked up but didn't fix anything.)

  13. emus

    I believe that's the setup

  14. MattJ

    Guus: ahh, sorry, I misunderstood. My guess is it probably won't be picked up until there are changes. I'll have a look in a bit.

  15. Guus

    Thanks MattJ. None of this is end-of-the-world important (and I'll be off soon anyway). If you do find the time to give something a push, that'd be great; if not, I'll see it pop up at some point in the future.

  16. Guus

    (it has been pretty much an hour, and no changes are visible, fwiw)

  17. emus

    wurstsalat:

  18. wurstsalat

    The website builds every 24 h (or manually) and deploys every hour. MattJ, you can trigger a manual build via the GitHub Actions tab > Workflows > website build > run manually
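
    (The same manual trigger can also be fired from the command line with the GitHub CLI; a minimal sketch, assuming the repository is xsf/xmpp.org and the workflow's display name is "Website build" — neither is confirmed in the log:)

    ```sh
    # Trigger the website build workflow manually, equivalent to the
    # Actions tab > Workflows > "Website build" > run manually route above.
    # Repository and workflow name are assumptions.
    gh workflow run "Website build" --repo xsf/xmpp.org

    # Check on the run it just started
    gh run list --repo xsf/xmpp.org --limit 1
    ```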

  19. wurstsalat

    I'll be back on Sunday ;)

  20. MattJ

    👍

  21. Guus

    Oh, I can apparently request a manual build myself.

  22. Guus

    (is that intended?)

  23. Maranda

    Huho

  24. Maranda

    Feeds went crazy again

  25. Maranda

    (RSS that is)

  26. Maranda

    https://aria.im/_matrix/media/v1/download/aria-net.org/HsAIdQttvvQQrNtZFTrlNgrq

  27. MattJ

    Maranda, how often does that poll?

  28. MattJ

    I accidentally deployed an old version of the site for ~2 minutes

  29. Maranda

    Every 5 minutes, iirc

  30. MattJ

    Bad luck

  31. MattJ

    Anyway, sorry!

  32. Maranda

    Indeed

  33. Maranda

    🤣

  34. MattJ

    guus.der.kinderen, https://xmpp.org/software/servers/openfire/ :)

  35. MattJ

    Managed to force a redeployment

  36. MattJ

    I have an idea to ensure DOAPs are refreshed periodically even without other website changes

  37. guus.der.kinderen

    Ah, the logo is there now! Thanks! The XEP links still don't render though. 🤨

  38. MattJ

    Okay, it will now pull from DOAPs daily (or whenever website changes are made)

  39. MattJ

    And for clarification: the stuff on Github is not at all connected to actual website deployments

  40. MattJ

    Although it would be nice to have some UI for forcing deployment, it's not trivial to link up

  41. MattJ

    So that stuff is mostly useful for sanity-checking PRs and such

  42. Maranda has to still look at the DOAP stuff for Metronome 😔

  43. guus.der.kinderen

    Thanks MattJ!

  44. emus

    Maranda looking forward!

  45. wurstsalat

    Why not take the github build artifact, extract it, copy the nginx config, and be done? Now we have two layers of building the website, and the one on the server isn't transparent for outsiders/me

  46. MattJ

    wurstsalat: automating the locating and downloading of the artifact is not trivial

  47. MattJ

    It requires interfacing with the GitHub API, permissions, and as far as I can see OAuth (there used to be a way to generate API keys without doing the OAuth dance, but I couldn't find it, or it only applies to individual accounts and not orgs)

  48. MattJ

    Either way, it's not as simple as "just take the artefact"

  49. thilo.molitor

    MattJ: why not use a github action for this? In an action you automatically have access to the repo and any build artifacts you create...

  50. mathieui

    thilo.molitor: you mean, giving write access on the server to github actions?

  51. thilo.molitor

    For Monal we are using a build pipeline built on GitHub Actions that automatically builds a new Monal release and uploads it to Apple for every push/merge onto the stable branch... that works great...

  52. thilo.molitor

    mathieui: you don't need write access to the whole server; a script (PHP, whatever) on the server that is given a zip file and extracts it to the proper location would suffice... using a GitHub secret to guard it against zip uploads that don't come from the GitHub action...
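
    (A rough sketch of the push-style upload thilo.molitor outlines, as seen from the GitHub Actions side; the endpoint URL, secret name and archive name are all hypothetical, nothing like this exists in the log:)

    ```sh
    # Hypothetical: push the freshly built site archive to a small receiver
    # script on the web server, authenticated with a shared secret stored as
    # a GitHub Actions secret.
    curl --fail -X POST \
         -H "Authorization: Bearer $DEPLOY_SECRET" \
         -F "site=@website-build.zip" \
         https://deploy.example.org/xmpp-org-site

    # The receiver would verify the secret and unpack the archive into the
    # web root; its implementation is not shown here.
    ```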

  53. thilo.molitor

    You could even run a github actions runner on a server of your choice and only give that server access to the website upload...that way you don't even need to give github the upload secret...

  54. MattJ

    thilo.molitor: it's a GitHub action that produces the artifact in the first place. It's just a matter of getting the build from Github to the server.

  55. MattJ

    Which works fine via git pull right now
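
    (For context, the pull-based setup MattJ describes boils down to something like the following cron entry; path, user and options are assumptions, not details from the log:)

    ```sh
    # /etc/cron.d/xmpp-org-site -- illustrative only; repo location and user
    # are assumptions.
    # m  h  dom mon dow  user      command
      0  *  *   *   *    www-data  git -C /var/www/xmpp.org pull --ff-only --quiet
    ```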

  56. thilo.molitor

    MattJ: usually I try to use push semantics instead of pull...but yes, doing periodic pulls should be fine too :)

  57. MattJ

    I don't see an hourly git pull as something that's broken enough for me to replace it with something more time-consuming and fragile, such as maintaining and securing a CI runner on the machine

  58. MattJ

    I try to do the same, but it's harder

  59. thilo.molitor

    I know

  60. moparisthebest

    Personally I'm avoiding GitHub actions so I don't have to change anything when they inevitably start charging for them

  61. moparisthebest

    Burn me once (travis-ci), shame on you; burn me twice, shame on me

  62. MattJ

    Yeah, I don't like dependencies on free CI for the same reason

  63. MattJ

    We had the same with Docker Hub as well already

  64. MattJ

    After more than half a year they did finally approve us as an open-source project

  65. Maranda[x]

    > Maranda looking forward!
    I have a stack of packed hardware next to my desk at work to do... which is now almost as tall as me 🤦

  66. Maranda[x]

    > Maranda looking forward!
    I have a stack of packed hardware next to my desk at work to do... which is now almost as tall as me, emus 🤦

  67. Maranda[x]

    So whenever I manage to dive out of it 😅

  68. emus

    Maranda[x]: I understand

  69. flow

    MattJ, in a similar situation, I do a git fetch every 5 minutes (I think even every minute would be fine) and check if the commit of a remote ref has changed. Would that be possible here?
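
    (A minimal sketch of what flow describes: fetch frequently, but only act when the upstream ref has actually moved; the checkout path and the deploy step are assumptions:)

    ```sh
    # Run every few minutes from cron; cheap when nothing has changed.
    cd /var/www/xmpp.org   # hypothetical checkout path
    git fetch --quiet
    if [ "$(git rev-parse HEAD)" != "$(git rev-parse @{u})" ]; then
        git merge --ff-only @{u}
        # ...then trigger whatever rebuild/redeploy step the site needs...
    fi
    ```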

  70. MattJ

    What are we trying to solve? That sounds pretty much like what we're already doing.

  71. flow

    IIRC guus had to wait up to an hour to see the results of this change, reducing that to minutes seems like a good idea

  72. flow

    IIRC guus had to wait up to an hour to see the results of his change, reducing that to minutes seems like a good idea

  73. MattJ

    Guus wasn't actually modifying the xmpp.org repo

  74. MattJ

    So even with a reduced polling frequency, nothing would have happened

  75. flow

    ok, nevermind then

  76. thilo.molitor

    how is the compliance level determined here: https://xmpp.org/software/clients/monal-im/

  77. thilo.molitor

    is this something I'll have to add to my doap file?

  78. Zash

    using https://code.zash.se/compliancer/ which checks the doap for the XEPs mentioned in a compliance suite

  79. Zash

    `<category rdf:resource="https://linkmauve.fr/ns/xmpp-doap#category-client"/>` (which marks the file as describing a client) seems to be missing from a few clients' DOAP files; no idea if that's why no compliance levels are shown for some clients, or if there's something else

  80. MattJ

    The new renderings are great, but once we get a good handle on representing the compliance levels, we should bury the verbose XEP list a bit further I think (either collapse it by default or move to a second page for technical details)

  81. thilo.molitor

    Zash: I did not find any client having a compliance level (but I did not check all)

  82. Zash

    Wasn't there a plan to have those designed badges?

  83. MattJ

    As it is, it looks like we expect users to read and understand that stuff. It kind of plays into what people *say* using XMPP requires

  84. Zash

    thilo.molitor, https://xmpp.org/software/clients/dino/

  85. thilo.molitor

    Zash: okay, I added the category-client stuff now... MattJ: the site is automatically re-rendered every hour, right?

  86. MattJ

    DOAPs will be refreshed daily, unless someone commits a change to the site

  87. thilo.molitor

    MattJ: ah okay... do you know at what time?

  88. wurstsalat

    thilo.molitor: you can run the Compliancer tool manually as well https://code.zash.se/compliancer/

  89. thilo.molitor

    wurstsalat: I don't know how to run it / what dependencies to install to make it run... the makefile outputs an error:
    $ make
    squish --use-http
    make: squish: No such file or directory
    make: *** [GNUmakefile:7: compliance] Error 127

  90. thilo.molitor

    is it this tool over here? https://github.com/LuaDist/squish

  91. thilo.molitor

    never mind, it is and the compliance tester works now :)

  92. Zash

    http://code.matthewwild.co.uk/squish/
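
    (The make error quoted earlier only means the squish tool is missing from $PATH; a minimal check, with install steps deliberately left out because they are not covered in the log:)

    ```sh
    # The failing recipe is `squish --use-http`; it needs the squish tool
    # (from the URL Zash posted) to be installed somewhere on $PATH.
    # Install steps are not covered here -- see squish's own README.
    command -v squish || echo "squish is not installed yet"

    # Once squish is on $PATH, the compliancer build from the log should work:
    make
    ```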

  93. thilo.molitor

    For the next compliance suite we should discuss whether we really want to require Private XML Storage (XEP-0049), Jingle File Transfer (XEP-0234) and Jingle In-Band Bytestreams Transport Method (XEP-0261) for advanced clients... at least for Monal I won't ever implement any of these, because it's not really required given that we have HTTP Upload and PEP...

  94. singpolyma

    XML storage is a compatibility thing. If you don't have it you may just totally miss data from other clients. Jingle is pretty much essential if you transfer big files

  95. thilo.molitor

    well, all data relevant to modern clients is mirrored to PEP nodes (bookmarks etc.) by modern servers...what important data could I miss if not implementing XML storage?

  96. thilo.molitor

    big files is a valid argument for jingle file transfer, though...

  97. Zash

    thilo.molitor, "complete" XEP-0045 implementation? somehow I doubt this

  98. thilo.molitor

    Zash: the compliance suite says in [footnote 45](https://xmpp.org/extensions/xep-0459.html#nt-idm46436970603904): Support for the Entity Use Cases and Occupant Use Cases is REQUIRED; support for the remaining use cases is RECOMMENDED.

  99. thilo.molitor

    Monal supports the REQUIRED parts (we are still working on the remaining cases, though)

  100. Zash

    Next time you look, you will surely find additional requirements! 😛

  101. thilo.molitor

    Zash: haha :D to be honest I did not know if your compliance tester would accept a "partial" there (I changed the doap before I got it to run on my laptop).

  102. thilo.molitor

    But it seems it's accepting a "partial", too...so changing this back to partial is fine with me, even if Monal technically meets the requirements detailed in the footnote...

  103. Zash

    IIRC it's mostly checking for e.g. "removed"

  104. Zash

    Ah, status=planned|wontfix or <until> discounts a XEP, otherwise it's considered Good Enough

  105. thilo.molitor

    interesting...I'll change 0045 back to partial then, that feels better :)

  106. Zash

    It was half joking fwiw. https://xmpp.org/extensions/xep-0453.html doesn't seem to explain 'partial' and 'complete' so it might be up to interpretation anyway.

  107. thilo.molitor

    Zash: partial still feels better :) I changed it to "complete" in the first place only because I did not want the compliance tester to not even give IM Core level to monal...that would have been an understatement ;)