Topic on User talk:Matanya

IKhitron (talk | contribs)

Thanks.

Matanya (talk | contribs)

I didn't receive it.

IKhitron (talk | contribs)

I sent it again.

Matanya (talk | contribs)

There's no such thing.

IKhitron (talk | contribs)

Could it be that you have a problem with your email? Or with the spam folder?

IKhitron (talk | contribs)

Matanya?

Matanya (talk | contribs)

I receive mail fine from everyone else. Are you on Gmail?

IKhitron (talk | contribs)

Yes. And I sent several messages, both through Wikipedia and directly to your address.

Matanya (talk | contribs)

Strange. Catch me on IRC.

IKhitron (talk | contribs)

Check your email settings, independently of Wikipedia. And the spam folder. IRC would be even worse. The whole reason I wanted email is that this is too long. But there's no choice, so I'll write it here briefly. Is there any chance you could run a pywikibot script that Eran recommended 1200 times for me? I can write a run file in tcsh, say, so that all you'd have to do is press Enter and it would do all the rounds. Thanks in advance.
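
For illustration only, a minimal sketch of what such a wrapper might look like, written in bash rather than tcsh. The file name pages.txt and its contents are hypothetical placeholders; the invocation reuses the pwb.py command and the -page option that appear later in this thread:

#!/bin/bash
# Hypothetical wrapper: read page titles from pages.txt (one title per line)
# and run the redirect-fixing script once per page.
while IFS= read -r title; do
    python pwb.py scripts/fixing_redirects.py -page:"$title"
done < pages.txt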

Matanya (talk | contribs)

Yes, and there's no need for you to write a tcsh script.

IKhitron (talk | contribs)

Thank you very much! But without it you'll have to type 1200 lines of code. Why?

Matanya (talk | contribs)

Because I'm capable of writing it in bash.

IKhitron (talk | contribs)

I don't understand why you would want to write 1200 lines in bash, but let's set that aside for now. I'm talking about the need to delete 1500 redirect pages that redirect from one talk namespace to another talk namespace. 1200 of those redirects have pages linking to them, and those pages need to be fixed so that they link to the redirect's target instead. There is a script for this. Any chance you could tell me what the command-line parameters are? Thanks.

Matanya (talk | contribs)

fixing_redirects.py -help will answer that.

IKhitron (talk | contribs)

Exactly. And since I don't have it and have no way to install it, I asked for your help. Thanks.

Matanya (talk | contribs)

$ python pwb.py scripts/fixing_redirects.py -help

Correct all redirect links in featured pages or only one page of each wiki.

Can be used with:
-catfilter        Filter the page generator to only yield pages in the
                  specified category. See -cat for argument format.

-cat              Work on all pages which are in a specific category.
                  Argument can also be given as "-cat:categoryname" or
                  as "-cat:categoryname|fromtitle" (using # instead of |
                  is also allowed in this one and the following)

-catr             Like -cat, but also recursively includes pages in
                  subcategories, sub-subcategories etc. of the
                  given category.
                  Argument can also be given as "-catr:categoryname" or
                  as "-catr:categoryname|fromtitle".

-subcats          Work on all subcategories of a specific category.
                  Argument can also be given as "-subcats:categoryname" or
                  as "-subcats:categoryname|fromtitle".

-subcatsr         Like -subcats, but also includes sub-subcategories etc. of
                  the given category.
                  Argument can also be given as "-subcatsr:categoryname" or
                  as "-subcatsr:categoryname|fromtitle".

-uncat            Work on all pages which are not categorised.

-uncatcat         Work on all categories which are not categorised.

-uncatfiles       Work on all files which are not categorised.

-file             Read a list of pages to treat from the named text file.
                  Page titles in the file may be either enclosed with
                  [[brackets]], or be separated by new lines.
                  Argument can also be given as "-file:filename".

-filelinks        Work on all pages that use a certain image/media file.
                  Argument can also be given as "-filelinks:filename".

-search           Work on all pages that are found in a MediaWiki search
                  across all namespaces.

-logevents        Work on articles that were on a specified Special:Log.
                  The value may be a comma separated list of three values:

                      logevent,username,total

                  To use the default value, use an empty string.
                  You have options for every type of logs given by the
                  log event parameter which could be one of the following:

                      block, protect, rights, delete, upload, move, import,
                      patrol, merge, suppress, review, stable, gblblock,
                      renameuser, globalauth, gblrights, abusefilter, newusers

                  It uses the default number of pages 10.

                  Examples:

                  -logevents:move gives pages from move log (usually redirects)
                  -logevents:delete,,20 gives 20 pages from deletion log
                  -logevents:protect,Usr gives pages from protect by user Usr
                  -logevents:patrol,Usr,20 gives 20 patrolled pages by user Usr

                  In some cases it must be written as -logevents:"patrol,Usr,20"

-namespaces       Filter the page generator to only yield pages in the
-namespace        specified namespaces. Separate multiple namespace
-ns               numbers or names with commas.
                  Examples:

                  -ns:0,2,4
                  -ns:Help,MediaWiki

                  If used with -newpages/-random/-randomredirect,
                  -namespace/ns must be provided before
                  -newpages/-random/-randomredirect.
                  If used with -recentchanges, efficiency is improved if
                  -namespace/ns is provided before -recentchanges.

                  If used with -start, -namespace/ns shall contain only one
                  value.

-interwiki        Work on the given page and all equivalent pages in other
                  languages. This can, for example, be used to fight
                  multi-site spamming.
                  Attention: this will cause the bot to modify
                  pages on several wiki sites, this is not well tested,
                  so check your edits!

-limit:n          When used with any other argument that specifies a set
                  of pages, work on no more than n pages in total.

-links            Work on all pages that are linked from a certain page.
                  Argument can also be given as "-links:linkingpagetitle".

-liverecentchanges Work on pages from the live recent changes feed. If used as
                  -liverecentchanges:x, work on x recent changes.

-imagesused       Work on all images that are contained on a certain page.
                  Argument can also be given as "-imagesused:linkingpagetitle".

-newimages        Work on the 100 newest images. If given as -newimages:x,
                  will work on the x newest images.

-newpages         Work on the most recent new pages. If given as -newpages:x,
                  will work on the x newest pages.

-recentchanges    Work on the pages with the most recent changes. If
                  given as -recentchanges:x, will work on the x most recently
                  changed pages.

-unconnectedpages Work on the most recent unconnected pages to the Wikibase
                  repository. Given as -unconnectedpages:x, will work on the
                  x most recent unconnected pages.

-ref              Work on all pages that link to a certain page.
                  Argument can also be given as "-ref:referredpagetitle".

-start            Specifies that the robot should go alphabetically through
                  all pages on the home wiki, starting at the named page.
                  Argument can also be given as "-start:pagetitle".

                  You can also include a namespace. For example,
                  "-start:Template:!" will make the bot work on all pages
                  in the template namespace.

                  default value is start:!

-prefixindex      Work on pages commencing with a common prefix.

-step:n           When used with any other argument that specifies a set
                  of pages, only retrieve n pages at a time from the wiki
                  server.

-subpage:n        Filters pages to only those that have depth n
                  i.e. a depth of 0 filters out all pages that are subpages, and
                  a depth of 1 filters out all pages that are subpages of subpages.

-titleregex       A regular expression that needs to match the article title
                  otherwise the page won't be returned.
                  Multiple -titleregex:regexpr can be provided and the page will
                  be returned if title is matched by any of the regexpr
                  provided.
                  Case insensitive regular expressions will be used and
                  dot matches any character.

-transcludes      Work on all pages that use a certain template.
                  Argument can also be given as "-transcludes:Title".

-unusedfiles      Work on all description pages of images/media files that are
                  not used anywhere.
                  Argument can be given as "-unusedfiles:n" where
                  n is the maximum number of articles to work on.

-lonelypages      Work on all articles that are not linked from any other
                  article.
                  Argument can be given as "-lonelypages:n" where
                  n is the maximum number of articles to work on.

-unwatched        Work on all articles that are not watched by anyone.
                  Argument can be given as "-unwatched:n" where
                  n is the maximum number of articles to work on.

-usercontribs     Work on all articles that were edited by a certain user.
                  (Example : -usercontribs:DumZiBoT)

-weblink          Work on all articles that contain an external link to
                  a given URL; may be given as "-weblink:url"

-withoutinterwiki Work on all pages that don't have interlanguage links.
                  Argument can be given as "-withoutinterwiki:n" where
                  n is the total to fetch.

-mysqlquery       Takes a Mysql query string like
                  "SELECT page_namespace, page_title, FROM page
                  WHERE page_namespace = 0" and works on the resulting pages.

-wikidataquery    Takes a WikidataQuery query string like claim[31:12280]
                  and works on the resulting pages.

-searchitem       Takes a search string and works on Wikibase pages that
                  contain it.
                  Argument can be given as "-searchitem:text", where text
                  is the string to look for, or "-searchitem:lang:text", where
                  lang is the language to search items in.

-random           Work on random pages returned by [[מיוחד:Random|Special:Random]].
                  Can also be given as "-random:n" where n is the number
                  of pages to be returned, otherwise the default is 10 pages.

-randomredirect   Work on random redirect pages returned by
                  [[מיוחד:RandomRedirect|Special:RandomRedirect]]. Can also be given as
                  "-randomredirect:n" where n is the number of pages to be
                  returned, else 10 pages are returned.

-untagged         Work on image pages that don't have any license template on a
                  site given in the format "<language>.<project>.org, e.g.
                  "ja.wikipedia.org" or "commons.wikimedia.org".
                  Using an external Toolserver tool.

-google           Work on all pages that are found in a Google search.
                  You need a Google Web API license key. Note that Google
                  doesn't give out license keys anymore. See google_key in
                  config.py for instructions.
                  Argument can also be given as "-google:searchstring".

-yahoo            Work on all pages that are found in a Yahoo search.
                  Depends on python module pYsearch.  See yahoo_appid in
                  config.py for instructions.

-page             Work on a single page. Argument can also be given as
                  "-page:pagetitle", and supplied multiple times for
                  multiple pages.

-grep             A regular expression that needs to match the article
                  otherwise the page won't be returned.
                  Multiple -grep:regexpr can be provided and the page will
                  be returned if content is matched by any of the regexpr
                  provided.
                  Case insensitive regular expressions will be used and
                  dot matches any character, including a newline.

-ql               Filter pages based on page quality.
                  This is only applicable if contentmodel equals
                  'proofread-page', otherwise has no effects.
                  Valid values are in range 0-4.
                  Multiple values can be comma-separated.

-onlyif           A claim the page needs to contain, otherwise the item won't
                  be returned.
                  The format is property=value,qualifier=value. Multiple (or
                  none) qualifiers can be passed, separated by commas.
                  Examples: P1=Q2 (property P1 must contain value Q2),
                  P3=Q4,P5=Q6,P6=Q7 (property P3 with value Q4 and
                  qualifiers: P5 with value Q6 and P6 with value Q7).
                  Value can be page ID, coordinate in format:
                  latitude,longitude[,precision] (all values are in decimal
                  degrees), year, or plain string.
                  The argument can be provided multiple times and the item
                  page will be returned only if all of the claims are present.
                  Argument can be also given as "-onlyif:expression".

-onlyifnot        A claim the page must not contain, otherwise the item won't
                  be returned.
                  For usage and examples, see -onlyif above.

-intersect        Work on the intersection of all the provided generators.

-featured         Run over featured pages (for some wikimedia wikis only)

Run fixing_redirects.py -help to see all the command-line
options -file, -ref, -links, ...



Global arguments available for all bots:

-dir:PATH         Read the bot's configuration data from directory given by
                  PATH, instead of from the default directory.

-lang:xx          Set the language of the wiki you want to work on, overriding
                  the configuration in user-config.py. xx should be the
                  language code.

-family:xyz       Set the family of the wiki you want to work on, e.g.
                  wikipedia, wiktionary, wikitravel, ...
                  This will override the configuration in user-config.py.

-user:xyz         Log in as user 'xyz' instead of the default username.

-daemonize:xyz    Immediately return control to the terminal and redirect
                  stdout and stderr to file xyz.
                  (only use for bots that require no input from stdin).

-help             Show this help text.

-log              Enable the log file, using the default filename
                  'fixing_redirects-bot.log'
                  Logs will be stored in the logs subdirectory.

-log:xyz          Enable the log file, using 'xyz' as the filename.

-nolog            Disable the log file (if it is enabled by default).

-maxlag           Sets a new maxlag parameter to a number of seconds. Defer bot
                  edits during periods of database server lag. Default is set by
                  config.py

-putthrottle:n    Set the minimum time (in seconds) the bot will wait between
-pt:n             saving pages.
-put_throttle:n

-debug:item       Enable the log file and include extensive debugging data
-debug            for component "item" (for all components if the second form
                  is used).

-verbose          Have the bot provide additional console output that may be
-v                useful in debugging.

-cosmeticchanges  Toggles the cosmetic_changes setting made in config.py or
-cc               user_config.py to its inverse and overrules it. All other
                  settings and restrictions are untouched.

-simulate         Disables writing to the server. Useful for testing and
                  debugging of new code (if given, doesn't do any real
                  changes, but only shows what would have been changed).

-<config var>:n   You may use all given numeric config variables as option and
                  modify it with command line.
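
For illustration only, a minimal sketch of how the options listed above could be combined for a controlled test of the task discussed in this thread: -page limits the run to a single page, and -simulate (a global option listed above) prevents any real edit from being saved. The page title below is a hypothetical placeholder:

$ python pwb.py scripts/fixing_redirects.py -page:"Talk:Example" -simulate
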
IKhitron (talk | contribs)

Thank you very much! But after reading this, I have a feeling it's not the right tool. I'll talk to Eran.

ערן (talk | contribs)

Yigal, I think the script is suitable for fixing redirects, but it would be best to test it on a single page in a controlled way, and even better if you skim the code: go over the treat_page function at the link you pointed to. Overall, the code should be readable even for someone who doesn't speak Python.

IKhitron (talk | contribs)

I'll try, Eran, but before that I have a question for you. From what I've read here, I believe that if we give it a page name, it will fix all the links to redirects on that page, and not only the ones we need, which is something you warned me about. Am I wrong?

ערן (talk | contribs)

Correct. It doesn't fix a specific redirect.

IKhitron (talk | contribs)

So in that case, Eran, does that mean that as far as you're concerned it isn't risky?

ערן (talk | contribs)
IKhitron (talk | contribs)

I assume these will be about 10,000 edits, Eran. There's no way to check all of them manually, is there?

ערן (talk | contribs)

Right, it isn't possible to check all of them manually.

IKhitron (talk | contribs)

So what do we do, Eran? Ignore the risk, or not run it at all?

ערן (talk | contribs)

Yes, it's better not to run it.

IKhitron (talk | contribs)

Understood. Thanks, I'll think about the other tool you suggested. Thank you very much, Matanya, but it looks like this isn't going to work out.
