#Introduction: I’m Lucas, bi software developer and Wikimedian from Berlin. I like writing tools for #Wikimedia #Toolforge, mainly in #Python, though I’m also working on a #JavaScript library to use the MediaWiki API; on the projects, I’m mainly active on #Wikidata and #WikimediaCommons. I play the #piano and occasionally live-stream that or post recordings. I’m making my way through #Tolkien’s The History of Middle-earth. I speak German, English and some Portuguese. Nice to meet y’all 🙂
Hello there! #introduction
My name is Katharina and my topics are data, politics, and their intersections.
Currently, I work as a data journalist at Bayerischer Rundfunk. Is the hashtag #ddj still a thing?
Previously, I built a #KnowledgeGraph on queer history. Read more at: https://queerdata.forummuenchen.org/en/ #linkeddata #wikidata
Mostly I code in R; the #rstats community is a great, caring group of people. I like to write tutorials 🤷 and occasionally post them on https://katharinabrunner.de/blog
I am excited to share some brief news on the recent progress we made in pushing bibliographic data on all #Arabic #Periodicals published before 1930 and their editors to #Wikidata: we did it!
With a bit of SPARQL, one can now browse our data set from https://projectjaraid.github.io/ as a graph (https://tinyurl.com/jaraid-graph), a table (https://w.wiki/9rDP) or a map (https://w.wiki/9o3Z).
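For those who prefer scripting these lookups, here is a minimal Python sketch against the Wikidata SPARQL endpoint; the class and property choices (Q1002697 "periodical", Q13955 "Arabic", P407, P571) are my assumptions about the data model, not something taken from the project:

```python
import requests

ENDPOINT = "https://query.wikidata.org/sparql"
QUERY = """
SELECT ?periodical ?periodicalLabel ?inception WHERE {
  ?periodical wdt:P31/wdt:P279* wd:Q1002697 ;  # instance of (a subclass of) periodical -- assumption
              wdt:P407 wd:Q13955 ;             # language of work: Arabic -- assumption
              wdt:P571 ?inception .            # inception date
  FILTER(YEAR(?inception) < 1930)
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en,ar". }
}
LIMIT 100
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "jaraid-demo/0.1 (example)"},
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["periodicalLabel"]["value"], row["inception"]["value"])
```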
More detailed descriptions of our effort will follow in the form of blog posts (and potentially a longer thread).
Holdings data beyond #HathiTrust, #OCLC and the German #ZDB has not been pushed yet.
#PeriodicalStudies #MultilingualDH #ArabicPeriodicals #Arabic #Ottoman #Mahjar #الصحافة_العربية #DigitalHumanities #dh
#Wikidata currently holds information on almost 20,000 periodicals published worldwide before 1930: https://w.wiki/A6s8. But, as one would suspect, the quality of the data and its coverage differ widely between regions.
So, if you work on periodicals, particularly those published outside the Global North and the major centres of publication, consider adding your knowledge to Wikidata so that others can link to and discover these awesome resources (a small scripted example follows below).
#PeriodicalStudies #multilingualDH #CrowdSourcing #DecoloniseKnowledge #DH #DigitalHumanities #LOD #SemanticWeb
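For anyone wondering what contributing such a statement looks like in code, here is a hypothetical sketch with pywikibot, assuming a configured, logged-in installation; the targets below are placeholders (the Wikidata sandbox item, and Arabic as an illustrative language), not items from the posts above:

```python
import pywikibot

# Hypothetical sketch of adding one statement. Q4115189 is the Wikidata
# sandbox item (safe to edit); P407/Q13955 ("language of work": Arabic)
# are illustrative choices only.
site = pywikibot.Site("wikidata", "wikidata")
repo = site.data_repository()

item = pywikibot.ItemPage(repo, "Q4115189")           # sandbox item
claim = pywikibot.Claim(repo, "P407")                 # language of work or name
claim.setTarget(pywikibot.ItemPage(repo, "Q13955"))   # Arabic
item.addClaim(claim, summary="add language of work")
```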
We are continuing our dive into field surveys with #Wikidata. This time, we looked at #historians, defined as humans whose field of work is https://www.wikidata.org/wiki/Q1066186 or who have authored a work on this topic.
The query https://w.wiki/AYjY shows 53,066 historians on a map. A person has as many dots as they have had affiliations. Yet there are "only" c.34k dots on this map, because some affiliations are either not listed on Wikidata or lack geo-coordinates.
The skewed distribution of the data set is not surprising, with high-density clusters in Europe, North America, and Japan.
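A rough reconstruction of what such a survey query could look like (the published query at https://w.wiki/AYjY may well differ; P101, P1416 and P625 are assumptions, and the "authored a work on this topic" branch is left out for brevity):

```python
import requests

ENDPOINT = "https://query.wikidata.org/sparql"
QUERY = """
#defaultView:Map
SELECT ?person ?personLabel ?coords WHERE {
  ?person wdt:P31 wd:Q5 ;           # human
          wdt:P101 wd:Q1066186 ;    # field of work: the item from the post
          wdt:P1416 ?affiliation .  # one result row per person-affiliation pair
  ?affiliation wdt:P625 ?coords .   # affiliations without coordinates drop out,
                                    # hence fewer dots than historians
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
"""
data = requests.get(
    ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "historians-map-demo/0.1"},
).json()
print(len(data["results"]["bindings"]), "person-affiliation dots")
```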
Last week, I pushed metadata for more than 700 Ottoman Turkish periodicals, published mainly between the Young Turk Revolution of 1908 and the end of the empire, to #Wikidata. The data is based on Baykal's wonderful index (https://doi.org/10.1163/9789004394889).
Together with the Arabic periodicals added earlier this year, coverage of periodical history beyond English, French or German on Wikidata is now pretty good. Thanks to these efforts, English is now severely underrepresented (as a percentage of periodicals represented on Wikidata): https://w.wiki/ArRb.
Arabic is the second most prominent language (after English) and Ottoman the ninth. Swedish is a surprising third: given the difference in the number of speakers, it is quite astonishing that at least c.2,750 Swedish newspapers were published before 1930, compared to a grand total of c.3,000 Arabic titles in the same period.
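A sketch of a per-language tally in the same spirit as https://w.wiki/ArRb; that query computes percentages, while this one only counts titles per language of work, with the same assumed entity and property choices as above:

```python
import requests

ENDPOINT = "https://query.wikidata.org/sparql"
QUERY = """
SELECT ?langLabel (COUNT(DISTINCT ?periodical) AS ?n) WHERE {
  ?periodical wdt:P31/wdt:P279* wd:Q1002697 ;  # periodical -- assumption
              wdt:P571 ?inception ;            # inception
              wdt:P407 ?lang .                 # language of work
  FILTER(YEAR(?inception) < 1930)
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
GROUP BY ?langLabel
ORDER BY DESC(?n)
LIMIT 10
"""
rows = requests.get(
    ENDPOINT,
    params={"query": QUERY, "format": "json"},
    headers={"User-Agent": "language-tally-demo/0.1"},
).json()["results"]["bindings"]
for row in rows:
    print(row["langLabel"]["value"], row["n"]["value"])
```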
The following Wikidata query lists Fediverse instances located in Latin America. Let me know if you know of an instance that is missing from the query's results.
Why am I doing this? To visualise and promote the Fediverse nodes in Latin America.
Re-#introduction: recurring topics here.
#Wikimedia #Wikidata #Wikipedia #MediaWiki #OpenStreetMap #Wikimania #Wikisource #WikiCite #OpenRefine #wiki #Wiktionary #WikiLovesMonuments #Wikibase #Wikiquote
#i18n #L10n #translatewiki.net #Unicode #CLDR #languages
#Copyright #PublicDomain #PubblicoDominio #Copyleft #CreativeCommons #OpenData #UploadFilters #LicenzaLibera #DatiAperti
#InternetArchive #books #biblioteche #library #Koha #KohaILS #GLAM
#WikiTeam #digipres #ArchiveTeam #XSLT
1/4
RE: https://mastodon.social/@janvlug/114483272180796821
And a sixth province has arrived on #Mastodon: @provinciedrenthe
It has been there since 2025-10-16, but welcome nonetheless!
Here is a small map, based on #wikidata:
What configuration do #Mastodon accounts use? Yesterday, I shared a small sample of German NGOs. Tonight, I conducted a local analysis: I obtained all the Mastodon accounts known to #Wikidata and analysed them. This covers a total of 6,039 accounts, from organisations to people; the only filter is that they have been active in the last year. A sketch of the pipeline follows the numbers below.
- 79% have discoverability switched on (4,765/6,039).
- 34% are indexable (2,026/6,039).
- 35% have at least one verified link (2,139/6,039).
- 32% have at least one attached toot.
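As promised, a sketch of the pipeline: Mastodon handles come from Wikidata's "Mastodon address" property (P4033) and are then checked through the public Mastodon API. The `indexable` check via the profile-level `noindex` flag is an approximation, and the multi-server fan-out, rate limiting and error handling from the real run are omitted:

```python
import requests

SPARQL = "https://query.wikidata.org/sparql"
QUERY = "SELECT ?addr WHERE { ?item wdt:P4033 ?addr . } LIMIT 50"  # P4033: Mastodon address
handles = [
    b["addr"]["value"]
    for b in requests.get(
        SPARQL,
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "mastodon-checker-demo/0.1"},
    ).json()["results"]["bindings"]
]

def check(handle: str) -> dict | None:
    """Resolve user@server via the public API and read the relevant flags."""
    user, server = handle.rsplit("@", 1)
    r = requests.get(
        f"https://{server}/api/v1/accounts/lookup",
        params={"acct": user},
        timeout=10,
    )
    if r.status_code != 200:
        return None
    acct = r.json()
    return {
        "discoverable": bool(acct.get("discoverable")),
        # approximation: profile-level noindex flag (Mastodon >= 4.0)
        "indexable": not acct.get("noindex", True),
        "verified_link": any(f.get("verified_at") for f in acct.get("fields", [])),
    }

for handle in handles[:5]:
    print(handle, check(handle))
```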
These numbers are similar to those of the other lists. In my humble opinion, @MastodonEngineering has a configuration 'problem' for accounts interested in maximising their reach. Perhaps it is a UX problem because users are accustomed to this being the default, or maybe it's because more configuration features have been added and are all off by default. The "easiest" solution would probably be a toggle in the Mastodon settings labelled "I want to maximise my reach", which would activate features and provide prompts indicating that accounts with a description, verified fields or attached toots are more likely to be followed.
In the meantime, I have added this to the German-only Mastodon List tool, and I have built a German web tool for checking your account: https://mastodon-account-checker.playground.54gradsoftware.de/.
(If anyone is really interested in this tool being multilingual, I'm always happy to take PRs.)
Limitations: I used multiple Mastodon servers to get the results faster, which could affect the numbers. I'm also fairly sure that the verified-link check is not 100% accurate, as I sometimes get false negatives with my script.
I was inspired by this #2024 post by @stefan: https://stefanbohacek.com/blog/verification-in-the-fediverse/.
If anyone is interested in the 29 MB JSON file containing all the data, please contact me.