User talk:Underlying lk

The programmer of Grand Theft Auto V

I wondered why your bot would think Grand Theft Auto V was developed by the Develop magazine, and the only idea I got was that the English article cites the magazine in its references, using the proper citation templates, with <code>publisher=''[[Develop (magazine)|Develop]]''</code>. That is obviously a completely different thing from Template:Infobox video game's publisher= parameter. I can only hope your bot did not make the same mistake in too many items… --Mormegil (talk) 08:16, 30 April 2014 (UTC)

@Mormegil: it's not much of a mystery: the bot extracts the first valid wikilink it comes across in the given parameter, which in this case was Develop (Q844746).--Underlying lk (talk) 03:21, 1 May 2014 (UTC)
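A rough illustration of how a citation can slip through this way (a sketch, not the bot's actual code): grabbing the first wikilink in the raw parameter value matches the citation's link just as readily as a real publisher link.

    import re

    # The publisher= value as it appears inside the citation template:
    value = "''[[Develop (magazine)|Develop]]''"

    # Take the first wikilink target, stopping at '|', '#' or ']]'.
    m = re.search(r'\[\[([^\]|#]+)', value)
    print(m.group(1))  # -> 'Develop (magazine)', which resolves to Q844746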

UnderlyingBot and BC(E)

In this edit, the bot forgot the BC when setting the date of birth. Thanks Haplology (talk) 02:10, 2 May 2014 (UTC)

This also happened with some ancient Chinese kingdoms, Wei (Q912052) and Han (Q1574130) at least. --BurritoBazooka (talk) 17:59, 28 September 2015 (UTC)

Would you mind making a script for me?

Hello, thank you for contributing to Wikidata. I'm a member of Wikidata:Anime_and_Manga_task_force, so I want to contribute there, but there are too many things to do. I'd like to handle articles like K-On! (Q482482). I think each Infobox animanga sub-template links to one item.

Example: en:K-On! at the English Wikipedia.

  • Infobox animanga/Header : general information ==> Q482482
  • Infobox animanga/Print : the original manga ==> Q482482
  • Infobox animanga/Video : anime season 1 ==> Q15863567
  • Infobox animanga/Video : anime season 2 ==> Q15863577
  • ...

Only the original work (in this case, the manga item) has the interwiki links.

I tried to make a script to do this myself but failed. Would you mind making one, please?--Konggaru (talk) 10:22, 2 May 2014 (UTC)

@콩가루: Hi Konggaru, if I understand your request correctly, what you need is a script that will add publisher = [[Houbunsha]] to K-On! (Q482482), publisher = [[Sega]] to K-ON! Hōkago Live!! (Q860047), etc. This is possible, but since a script has no way to 'guess' that the item ID for anime season 1 is Q15863567 or that the item ID for the game is Q860047, the ID would have to be entered manually each time. I created a semi-automated pywikibot script along these lines; you will find it here, and it depends on a helper script that should be moved to the 'pywikibot' folder.--Underlying lk (talk) 03:01, 3 May 2014 (UTC)
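For readers wondering what 'semi-automated' means here, a minimal sketch of the approach (illustrative names and prompts, not the actual script): the operator types in each item ID that the script cannot guess, and the claim is then written automatically.

    import pywikibot

    repo = pywikibot.Site('wikidata', 'wikidata').data_repository()

    # The operator supplies the ID the script cannot guess, e.g. Q15863567.
    qid = input("Item ID for 'anime season 1'? ")
    item = pywikibot.ItemPage(repo, qid)

    # The target is looked up manually as well; P123 is 'publisher'.
    publisher_qid = input('Item ID of the publisher? ')
    claim = pywikibot.Claim(repo, 'P123')
    claim.setTarget(pywikibot.ItemPage(repo, publisher_qid))
    item.addClaim(claim, summary='adding publisher from Infobox animanga')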
Thank you for developing it, but when I tried to run it, it failed.

Item ID for 'tv series'? Q16747052 ==> I made a new item manually and entered it, but it doesn't work. Also, would you mind adding a new function? I think all items involved with the original work should have the same genre and main subject (P921) (if possible). Thank you as always, and sorry for bothering you. --Konggaru (talk) 15:34, 4 May 2014 (UTC)

@콩가루: Starting with the first error: this happened because 'ona' does not have a title. I have changed the script so that if 'title' is missing it will use the article's name instead (as in Oreimo (Q16747333) for the 'ona'). The second problem had the same cause; it should be fixed, as you can see at Is the Order a Rabbit? (Q16747052). Also, I've made a couple of changes to import genre (P136) and main subject (P921) from the main item, but only for videos, since games, for example, would have a different genre.--Underlying lk (talk) 17:42, 4 May 2014 (UTC)

Thank you. But would you mind adding another function? The part of property value of the original work should list all of the media involved with it, as in K-On! (Q482482). Please add this function. Really, thank you: your script works very well. --Konggaru (talk) 05:11, 5 May 2014 (UTC)

@콩가루: I made another change so that the script will add part of (P361) instead of based on (P144).--Underlying lk (talk) 06:26, 5 May 2014 (UTC)
I think based on (P144) is better than P361. Anyway, thank you, I'm using your script a lot and it works fine. But would you mind adding a new function that adds all adaptations to the original work? E.g. add the YuruYuri OVA and YuruYuri anime to the original work like this [1] --Konggaru (talk) 15:30, 10 May 2014 (UTC)
@콩가루: I might have misunderstood your request of 05:11, 5 May; weren't you asking me to use part of (P361) instead of based on (P144)? I can change it back to based on (P144), but how did you want to use part of (P361)? Also, I've made a change to add has part(s) (P527); the script is here. I haven't tested it extensively, so let me know if there are problems, as always.--Underlying lk (talk) 23:20, 10 May 2014 (UTC)

Importing data from a list

Hi Underlying lk, I have a list of values that I'd like to import into Wikidata. Do you know if there is an existing script which can easily be adapted to do it, or should I submit a bot request? Thanks in advance. — Ayack (talk) 14:53, 7 May 2014 (UTC)

@Ayack: I don't have any scripts for this, but I could probably make one. It would be better if you had a page generator or a list of pages instead of a list of item IDs, though, so that imported from Wikimedia project (P143) could be added to all claims instead of importing them without a source.--Underlying lk (talk) 07:35, 9 May 2014 (UTC)
@Ayack: I created a little script for this: sycamore.py. It depends on another script, wdhelper.py, being located in the core/pywikibot folder. I have tested it only once, so if problems arise don't hesitate to drop me a message. As I said above, this import will not add any sources; it would be better if one could be found for the data.--Underlying lk (talk) 09:38, 9 May 2014 (UTC)
Wow, thanks a lot! I can't test it now but I'll try later. In fact, I do have a source: if you could add stated in (P248): Sycomore (Q15271528), that would be great! — Ayack (talk) 09:46, 9 May 2014 (UTC)
@Ayack: Updated to add the source. See this test edit: [2].--Underlying lk (talk) 10:21, 9 May 2014 (UTC)
Thanks! — Ayack (talk) 10:28, 9 May 2014 (UTC)

Human groups

Hi, your bot often makes one particular error: it adds human-related properties to items that represent human groups, for example James and Oliver Phelps (Q343954), Arkady and Boris Strugatsky (Q153796), Paolo and Vittorio Taviani (Q351697), Auguste and Louis Lumière (Q55965). Could you add a check to the bot's code: if an item has the has part(s) (P527) property, do not touch the item. Or check instance of (P31). — Ivan A. Krestinin (talk) 18:28, 8 May 2014 (UTC)

@Ivan A. Krestinin: you're right about this, but I also noticed that often enough those items don't have has part(s) (P527) or instance of (P31), or if they do have instance of (P31) it's set to human (Q5). Changing the script would mean that the bot would make all those mistakes once again, so it would probably cause more trouble than it would avoid.--Underlying lk (talk) 07:24, 9 May 2014 (UTC)
A has part(s) (P527) or instance of (P31) check is a way to prevent edit wars. I have reverted your bot's edits many times, but the bot adds the invalid claims again. I am trying to improve data quality, but your bot makes this task hard. — Ivan A. Krestinin (talk) 10:58, 9 May 2014 (UTC)
@Ivan A. Krestinin: Did that happen since your last message?--Underlying lk (talk) 11:05, 9 May 2014 (UTC)
The latest invalid edit that I saw is [3]. I will notify you if I see it again. — Ivan A. Krestinin (talk) 11:16, 9 May 2014 (UTC)

Merging multiple numbers

Hi. Wikidata:Database reports/Constraint violations/P1083#"Range" violations (and possibly others) shows a couple of edits where you transformed infobox statements like "Capacity: 18,000, 2,000 seated" into something like "Capacity: 180002000". Please try to take more care when parsing numbers. --YMS (talk) 14:49, 17 May 2014 (UTC) PS: Found the next type immediately, in the constraints list for P1092: from "Number built: about 450-500", your bot made "total produced: -500". --YMS (talk) 14:54, 17 May 2014 (UTC)

@YMS: Hi, and sorry for the late reply. This is the regex I'm currently using to capture numbers:
<code>(?:^|[\s\-–~>:約])((?:(?<!\d)-?)\d[\d,\.\'\s]*)(\smillion)?(?:[\s<+/\}人]|$)</code>
It requires that the number, if preceded by a minus sign, should not come right after a digit, so perhaps that edit was made while I was still using an older version of this regex. The other mistake happened because the space was accepted as part of the number, since spaces are often used as thousands separators.--Underlying lk (talk) 18:47, 21 May 2014 (UTC)
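For the record, the regex reproduces both reports (a quick sketch; the first capture shows the comma/space thousands-separator problem, the second shows the lookbehind keeping the minus out):

    import re

    NUM = re.compile(r"(?:^|[\s\-–~>:約])((?:(?<!\d)-?)"
                     r"\d[\d,\.\'\s]*)(\smillion)?(?:[\s<+/\}人]|$)")

    # Commas and spaces are accepted as thousands separators, so both
    # figures merge into one capture: '18,000, 2,000' -> 180002000.
    print(NUM.search('Capacity: 18,000, 2,000 seated').group(1))

    # The (?<!\d) lookbehind rejects a minus that follows a digit, so
    # this yields '500' rather than the '-500' the older version made.
    print(NUM.search('Number built: about 450-500').group(1))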

labels bot

Hello, do you have a script which works similarly to claimit.py but adds descriptions? There are many categories which still have the old Czech description "Kategorie Wikipedie", and we would like to change it to "Kategorie Wikimedie". A similar task would apply to templates, project pages, etc.

My idea is: description.py -pagesgenerator [-overwrite], with the description(s) hard-coded in the script. JAn Dudík (talk) 19:07, 27 May 2014 (UTC)
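Something along these lines could serve as a starting point (a rough sketch only, with names per current pywikibot; the -overwrite handling is left out, and the generator shown is just one way to feed it category pages):

    import pywikibot
    from pywikibot import pagegenerators

    OLD = 'Kategorie Wikipedie'
    NEW = {'cs': 'Kategorie Wikimedie'}    # hard-coded, as proposed

    site = pywikibot.Site('cs', 'wikipedia')
    gen = pagegenerators.AllpagesPageGenerator(
        site=site, namespace=14, total=100)    # namespace 14 = categories

    for page in gen:
        try:
            item = pywikibot.ItemPage.fromPage(page)
            item.get()
        except pywikibot.exceptions.NoPageError:
            continue    # page has no Wikidata item
        if item.descriptions.get('cs') == OLD:
            item.editDescriptions(NEW, summary='Kategorie Wikipedie -> Wikimedie')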

errors in harvest_template

Hello, I am using your older versions of harvest_template and claimit. After some changes in other core scripts I got the following error message:

...
Processing [[cs:Kategorie:Clenove Strany pokrokovych socialistu]]
WARNING: Claim.getType is DEPRECATED, use Property.type instead.
[[cs:Strana pokrokovych socialistu]] doesn't exist so I can't link to it
...

I found this change, but after applying it I got new errors:

 
...
Processing [[cs:Kategorie:Alba Vanguard Records]]
ERROR: TypeError: 'unicode' object is not callable
Traceback (most recent call last):
  File "D:\Py\rewrite\scripts\ht2.py", line 89, in run
    self.procesPage(page)
  File "D:\Py\rewrite\scripts\ht2.py", line 294, in procesPage
    if claim.type() == 'wikibase-item':
TypeError: 'unicode' object is not callable
...

What should I do to get the scripts working again without errors? I mainly need date/time harvesting and harvesting of items without brackets; the rest can be done with the core scripts. JAn Dudík (talk) 06:23, 12 June 2014 (UTC)
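The TypeError is consistent with the deprecation warning in the first log: newer pywikibot replaced Claim.getType() with a plain string attribute, so calling it like a method fails. A hedged guess at the fix for the line shown in the traceback:

    import pywikibot

    repo = pywikibot.Site('wikidata', 'wikidata').data_repository()
    claim = pywikibot.Claim(repo, 'P31')

    # claim.type is a string such as 'wikibase-item', not a method, so
    # claim.type() raises "'unicode' object is not callable" on Python 2.
    if claim.type == 'wikibase-item':
        print('P31 takes item values')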

 Resolved This bug was in site.py. After applying the Gerrit change, the scripts work fine again. JAn Dudík (talk) 11:42, 16 June 2014 (UTC)

Using remove_claims.py with a list of Q-items

Hi, I'm trying to remove some incorrect statements added by my bot. I've got a list of these items in a file, but the script doesn't seem to find them: <code>Processing [[wikidata:Q103767]] No Wikidata item for this page.</code> It's strange, because with harvest_template.py it works without any problem. Could you have a look at it please? Thanks. — Ayack (talk) 10:32, 13 September 2014 (UTC)

BTW, I use <code>python pwb.py remove_claims.py -lang:wikidata -family:wikidata -file:Liste2.txt P625</code>. — Ayack (talk) 10:34, 13 September 2014 (UTC)
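One way around the page-generator step, which appears to be what fails here, is to load the items directly from the file; a minimal sketch, assuming Liste2.txt holds one Q-id per line:

    import pywikibot

    repo = pywikibot.Site('wikidata', 'wikidata').data_repository()

    with open('Liste2.txt') as f:
        qids = [line.strip().strip('[]') for line in f if line.strip()]

    for qid in qids:
        item = pywikibot.ItemPage(repo, qid)
        item.get()
        for claim in item.claims.get('P625', []):
            item.removeClaims(claim, summary='removing incorrect coordinates')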


Spouse: Year 2014

Seems I replicated some of your import. Example: here. --- Jura 19:11, 22 October 2014 (UTC)

Incorrect instance of United Kingdom

This bot added several incorrect instance of (P31) claims with the value United Kingdom (Q145), for example on Secretary of State for Health and Social Care (Q3397406) and Chancellor of the Exchequer (Q531471). A bot should be careful about adding instance of (P31), since it has a very specific meaning. See the help.

I'm curious. Do you know why this bot added these claims? Jefft0 (talk) 14:11, 27 October 2014 (UTC)

@Jefft0: The bot was fetching claims from the post parameter of en:Template:Infobox official post, which in some cases erroneously contained United Kingdom instead of linking to the article about the post. Apologies.--Underlying lk (talk) 09:11, 30 October 2014 (UTC)

Your bot is adding instance of (P31) instead of subclass of (P279) when importing types of artillery pieces from infoboxes of the Russian Wikipedia

Hello. Check, for example, M2A2 Terra-Star (Q4043346). Is it a problem with ru-wiki? Thanks in advance. Ain92 (talk) 09:39, 3 November 2014 (UTC)

https://www.wikidata.org/w/index.php?title=Q216946&diff=117949224&oldid=116098764 I reverted this mistake--Oursana (talk) 15:42, 22 January 2015 (UTC)

sources of INE data

Dear Underlying lk,
your UnderlyingBot made a change to INE data. On the German Wikipedia there are some criticisms and questions; maybe you can give some answers? If you need translation help, please give me a ping. Regards, Conny (talk) 12:50, 5 June 2015 (UTC).

Pywikibot

Hello. Can I ask you some questions about the use of pywikibot? I can add claims from infobox parameters (using harvest) and from categories (using claimit). I wonder if there is a way to add claims from a Wikipedia table or from a list. Xaris333 (talk) 17:31, 29 July 2015 (UTC)

Hi Xaris, I'm sorry but I haven't contributed anything to Wikidata for a long time, and I have not operated a bot for even longer, so I'm not really in a position to help much!--Underlying lk (talk) 02:46, 14 August 2015 (UTC)
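For anyone landing here with the same question, one possible approach (not from any script discussed above; the list page is hypothetical) is to parse the page text and take the first wikilink of each list row:

    import mwparserfromhell
    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    page = pywikibot.Page(site, 'List of example things')    # hypothetical

    for line in page.text.splitlines():
        if not line.lstrip().startswith('*'):    # keep only list rows
            continue
        links = mwparserfromhell.parse(line).filter_wikilinks()
        if not links:
            continue
        target = pywikibot.Page(site, str(links[0].title))
        item = pywikibot.ItemPage.fromPage(target)
        print(item.getID())    # build and add the desired claim here instead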

\n in quantity produced by this bot

Looks like this bot (User:UnderlyingBot) produces broken data. E.g. in Indian Wells Tennis Garden (Q1427761), the maximum capacity (P1083) property holds the value "+16100\n", which is not a correct number for a quantity. You can check https://www.wikidata.org/wiki/Special:EntityData/Q1427761.json for the raw data to confirm the item is broken. See also: https://phabricator.wikimedia.org/T110728 --Smalyshev (WMF) (talk) 19:57, 28 August 2015 (UTC)
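The trailing "\n" suggests the raw infobox string went into the quantity unstripped; a guess at the one-line fix in whatever code built the value:

    raw = '16100\n'    # value as scraped from the infobox
    amount = int(raw.strip().replace(',', ''))
    print(amount)      # 16100, without the trailing newline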

Your bot and Property:P18

Special:Diff/119634110 looks like a bug in your bot. The bot copied a file name to Wikidata because there is a file in the article on English Wikipedia, but the file is a local file and not a file on Commons. Commons has a completely unrelated file with the same name, though. I think that your bot should check whether the file used on Wikipedia is a local file or a file on Commons before adding information about the file to Wikidata. --Stefan2 (talk) 21:03, 24 September 2015 (UTC)
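The suggested check is straightforward in pywikibot (a sketch; the method is file_is_shared() in current releases and fileIsShared() in older ones):

    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')
    image = pywikibot.FilePage(site, 'File:Example.jpg')

    # Only a file that actually lives on Commons is safe to copy into
    # image (P18); an identically named local upload is not.
    if image.file_is_shared():
        print('on Commons: OK to add as P18')
    else:
        print('local file only: skip')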

Broken founding date

See this edit. --Yair rand (talk) 16:17, 17 November 2015 (UTC)

strange "license"s

You may have produced several of these strange statements. While in some cases it may be seen as a (somewhat useful) creative way of using P275, in this case I can't even find that company mentioned in the given source. What's going on there? Can you fix it?--Frysch (talk) 16:16, 4 December 2015 (UTC)

Weird future publication dates

See this edit; same thing here and here. I guess many video games from this list as well. -- LaddΩ chat ;) 03:03, 18 January 2016 (UTC)

UnderlyingBot

Your bot has been listed at Wikidata:Requests for permissions/Removal/Inactive bot accounts as having been inactive for over two years. As a housekeeping measure it is proposed to remove the bot flag from inactive bot accounts, unless you expect the bot to be operated again in the near future. If you consent to the removal of the bot flag (or do not reply on the deflag page), you can re-request the bot flag at Wikidata:Requests for permissions/Bot should you need it again. Of course, you may request to retain your bot flag here if you need it. Regards--GZWDer (talk) 12:44, 26 June 2017 (UTC)

Ranks for historical data

Please don't change them to "deprecated"! That rank is for wrong data. Just mark the current data as "preferred". --Infovarius (talk) 12:21, 22 October 2018 (UTC)

Okay, will do.--Underlying lk (talk) 17:11, 22 October 2018 (UTC)

year 4305? (your bot's first edit was 4 years ago, maybe you fixed it by now?)

Why did your bot make edits like these, where the supposed year is 4305 and that year is also added to labels in multiple languages? In Swedish: "Datorspel från 4305". Before deciding it was wrong I assumed good faith and looked "4305" up in the English Wikipedia, and found no reference at all... oh wait, I found something: there's a Polygon article with the link http://www.polygon.com/2013/5/7/4305926/double-fine-humble-bundle-brutal-legend-psychonauts-mac-linux-pc oh :) ... now I see. Dbfyinginfo (talk) 17:43, 25 January 2019 (UTC)
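A plausible reconstruction of what happened (a guess, not the bot's confirmed code): a d/m/yyyy date pattern applied to raw reference text also matches inside the Polygon URL path, where "5/7/4305" looks like a date.

    import re

    DATE = re.compile(r'(\d{1,2})/(\d{1,2})/(\d{4})')

    url = ('http://www.polygon.com/2013/5/7/4305926/'
           'double-fine-humble-bundle-brutal-legend-psychonauts-mac-linux-pc')
    print(DATE.search(url).groups())    # ('5', '7', '4305') -> year 4305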

When adding population (P1082) values to items, please mark the most recent value with preferred rank, for example at Waldenburg (Q20085). Otherwise the item will have two (or more) "current" values. -- VlSergey (трёп) 06:57, 11 March 2019 (UTC)

Quickstatements cannot change ranks, unfortunately.--Underlying lk (talk) 09:30, 11 March 2019 (UTC)
Maybe it shouldn't be used for the population property, then. -- VlSergey (трёп) 10:20, 11 March 2019 (UTC)

Wrong date?

You entered "31 December 2017" as the point in time. But the source you cited is titled "Alle politisch selbständigen Gemeinden mit ausgewählten Merkmalen am 31.12.2018 (4. Quartal)", i.e. data as of 31 December 2018. --Eduard47 (talk) 08:39, 11 March 2019 (UTC)

True, but if you open the xls file it says "Bevölkerung am 31.12.2017" (population as of 31 December 2017).--Underlying lk (talk) 09:30, 11 March 2019 (UTC)
Sorry, you are right. But in many cases more recent data already exists, see Marne (Q542799), Husum (Q21159). --Eduard47 (talk) 09:56, 11 March 2019 (UTC)

Help:Add localized label to Wikidata

Hey, I'd like to add a label in Kabyle to [4]; how can I do that? There's no "add language" option. Thanks in advance. Sami At Ferḥat (talk) 20:08, 11 March 2019 (UTC)

@Sami At Ferḥat: you first need to add this code to your user page: {{#babel: kab-3}} (or 2, or 4, depending on your knowledge of the language). I've done it for you, hope you don't mind.--Underlying lk (talk) 20:16, 11 March 2019 (UTC)
ty so much! Sami At Ferḥat (talk) 22:28, 11 March 2019 (UTC)

Population Data for North Hesse

Hello Underlying lk,

I eventually found your wonderful work with imported data through this edit [5]. Such great and very helpful work, importing officially available data by bot! Before, I was not even aware that this data could be entered, and bingo, it is so useful! Based on this data you can find automatically created diagrams in Wikipedia articles; see the example at https://de.wikipedia.org/wiki/Stormbruch#Demographie

It took me nearly ONE HOUR to enter the bare population data for this ONE little village, without references (which are noted in the wiki article).

You (your bot) could be very helpful in adding such data. In many cases this would be the first population statement in the items for these municipalities. The key reference in these official statistics is the "Gemeindeschlüssel" = German municipality key in Wikidata: https://www.wikidata.org/wiki/Property:P439 (details at https://de.wikipedia.org/wiki/Amtlicher_Gemeindeschlüssel#Deutschland ). Keep in mind: for all data in these statistics the regional key is "06" https://www.wikidata.org/wiki/Property:P1388 , so it might also be useful to update missing regional keys.

I don't know which data formats you can handle best. Let's just begin with a dedicated example for the 2009/2010 data and my request to include this data in Wikidata items:

  • Table SP.1-25 "Bevölkerung insgesamt am 31. Dezember 2009" (this is the point in time for Wikidata)
  • Row in this table: "Bevölkerung insgesamt" (total population in 2009 for each listed municipality)

All data in these rows should be imported into the existing Wikidata items.


Let's just do some examples to see how it can work:

  • Hessen (the state of Hesse, Germany) https://www.wikidata.org/wiki/Q1199
    • Data = 6061951 (total inhabitants)
    • Point in time = 31. Dezember 2009
    • German municipality key /Gemeindeschlüssel= 000000 (special case of "Gemeindeschlüssel" because it is the complete state)
    • Regional key = 06
    • Reference: Hessisches Statistisches Landesamt: Hessische Gemeindestatistik 2010, webarchive online
  • Reg.-Bez. Darmstadt (Darmstadt Government Region) https://www.wikidata.org/wiki/Q7932
    • Data = 3792941
    • Point in time = 31. Dezember 2009
    • German municipality key / Gemeindeschlüssel = 400000
    • Regional key = 06
    • Reference: Hessisches Statistisches Landesamt: Hessische Gemeindestatistik 2010, webarchive online
  • Frankfurt am Main https://www.wikidata.org/wiki/Q1794
    • Data= 671927
    • Point in time = 31. Dezember 2009 (the entry for 2009 is missing; but attention: some other existing points might need a correction based on this more trustworthy data)
    • German municipality key / Gemeindeschlüssel= 412000
    • Regional key = 06
    • Reference: Hessisches Statistisches Landesamt: Hessische Gemeindestatistik 2010, webarchive online


Well, I don't know whether you like my suggestion to let your bot do some work. Surely there will be some problems to solve; nevertheless, I think it could be an enormous advantage and a great time saver for our project. Data for one year alone would already help... and every additional year helps more. :-)

Best --Thombansen (talk) 10:54, 9 May 2019 (UTC)
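Translating the first example above into pywikibot calls might look roughly like this (a sketch with the values from the Hessen example; Q63812020 is the statistics item mentioned below in this thread):

    import pywikibot
    from pywikibot import WbQuantity, WbTime

    repo = pywikibot.Site('wikidata', 'wikidata').data_repository()
    item = pywikibot.ItemPage(repo, 'Q1199')        # Hessen

    claim = pywikibot.Claim(repo, 'P1082')          # population
    claim.setTarget(WbQuantity(6061951, site=repo))

    when = pywikibot.Claim(repo, 'P585')            # point in time
    when.setTarget(WbTime(year=2009, month=12, day=31))

    source = pywikibot.Claim(repo, 'P248')          # stated in
    source.setTarget(pywikibot.ItemPage(repo, 'Q63812020'))

    item.addClaim(claim, summary='population on 2009-12-31')
    claim.addQualifier(when)
    claim.addSources([source])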

@Thombansen: Hi Thom, I looked at the links you provided and they only seem to cover Hessen. Searching around I found this: 12411-01-01-5 Bevölkerung nach Geschlecht - Stichtag 31.12. - regionale Tiefe: Gemeinden, which covers the 2008-2017 period. I also found another page with data going back to 1975. Perhaps we could use those instead?--Underlying lk (talk) 12:48, 9 May 2019 (UTC)
@Underlying lk: hey, you're so smart :-) Very fine. Whatever fits your needs for getting your bot working best will be fine; sources from https://www.regionalstatistik.de and https://www.destatis.de are official, i.e. trustworthy/reliable. If you like, you can start with this. For some other (even more difficult) requests I might contact you later, if that is OK with you. Best --Thombansen (talk) 13:26, 9 May 2019 (UTC) PS: I just had a deeper look into the files (1995.xls) from destatis.de. If the bot runs, it could probably also check/add, with references, the data for "Fläche in km2" https://www.wikidata.org/wiki/Property:P2046 and "Postleitzahl" = postal code https://www.wikidata.org/wiki/Property:P281
@Thombansen: Hi again, I started adding populations for German municipalities in 1975.--Underlying lk (talk) 00:54, 14 May 2019 (UTC)
@Underlying lk: very, very nice! More than 20,000 updates :-o I fixed the description for Q63812020. Happy to see more data coming in. Cheers --Thombansen (talk) 11:30, 14 May 2019 (UTC)

Outdated population figures

Hello Underlying lk,

When adding historical population figures, could you please either rank them as "deprecated" or rank the current figure as "preferred"? The outdated figures should not be on the same level as the recent ones. Thank you & kind regards, --RJFF (talk) 18:24, 31 August 2019 (UTC)

Lausanne

Hello. I just saw that the page Q22374786 should be merged into Q807. I do not have the rights to do it and would appreciate it if you could. Thanks in advance. 83.228.229.210 21:21, 2 September 2019 (UTC).

Community Insights Survey

RMaung (WMF) 17:37, 10 September 2019 (UTC)

Reminder: Community Insights Survey

RMaung (WMF) 19:53, 20 September 2019 (UTC)

Imported Date of Death

Dear lk, in Shepseskare Isi (Q268601) you imported a date of death from an Italian site. What hint made you think that this year might have been given as Gregorian? --Vollbracht (talk) 19:59, 3 February 2022 (UTC)

You didn't react! If you enter historical data prior to 1584, the use of Julian dates instead of proleptic Gregorian ones is presumed to be the agreed convention. According to ISO 8601, proleptic use of Gregorian dates needs an explicit agreement. You may get that in astronomy; you won't get it in history. --Vollbracht (talk) 11:20, 19 May 2022 (UTC)
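In pywikibot terms the distinction is the calendar model on the time value (a sketch; the year is illustrative, Q1985786 is the proleptic Julian calendar item and Q1985727 the proleptic Gregorian one):

    from pywikibot import WbTime

    JULIAN = 'http://www.wikidata.org/entity/Q1985786'

    # A date around 2455 BC, stated explicitly as Julian rather than
    # letting the default (proleptic Gregorian) apply.
    dod = WbTime(year=-2455, precision='year', calendarmodel=JULIAN)
    print(dod.calendarmodel)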

incorrect population data

Hi, I don't know how Wikidata works, but I want to point you to a data error in an import here: https://www.wikidata.org/wiki/Wikidata:Report_a_technical_problem#inhabitant_number_incorrect_for_2016_for_the_village_Conques_in_france Could you please look at it? It looks very wrong in the chart that is now used on the Wikipedia page of the village. 84.241.199.211 16:52, 21 November 2022 (UTC)

Call for participation in a task-based online experiment

Dear Underlying_lk,

I hope you are doing well,

I am Kholoud, a researcher at King's College London, working on a project as part of my PhD research in which I have developed a personalised recommender model that suggests Wikidata items to editors based on their past edits. I am inviting you to a task-based study that will ask you to provide your judgements about the relevance of the items suggested by our model based on your previous edits. Participation is completely voluntary, and your cooperation will enable us to evaluate the accuracy of the recommender system in suggesting relevant items to you. The results will be analysed anonymised and published at a research venue.

The study should take no more than 15 minutes.

If you agree to participate in this study, please either contact me at kholoud.alghamdi@kcl.ac.uk or use this form: https://docs.google.com/forms/d/e/1FAIpQLSees9WzFXR0Vl3mHLkZCaByeFHRrBy51kBca53euq9nt3XWog/viewform?usp=sf_link

Then, I will contact you with the link to start the study.

For more information about the study, please read this post: https://www.wikidata.org/wiki/User:Kholoudsaa In case you have further questions or require more information, don't hesitate to contact me at the email address above.

Thank you for considering taking part in this research.

Regards, Kholoudsaa (talk) 20:58, 17 February 2023 (UTC)