
Invalid CSRF token error during high volume import into Wikibase Suite (WBS)
Open, Needs Triage · Public · BUG REPORT

Description

Steps to replicate the issue (include links if applicable):

  • Set up an instance of Wikibase Suite (WBS) on a VPS running Ubuntu 20.04, including the WikibaseLexeme extension
  • Set up an OAuth 1.0 consumer for owner-only use
  • Use wikibaseintegrator scripts and the consumer tokens to import a large volume of data stored on the local machine

What happens?:
In the middle of a long import session, the script crashes with the error:

wikibaseintegrator.wbi_exceptions.MWApiError: 'Invalid CSRF token.'
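As a possible workaround for long sessions, the error can be caught so the token is refreshed before retrying the write. The helper below is only a generic sketch: `write` and `refresh_token` are placeholder callables standing in for the actual wikibaseintegrator write call and token refresh, not part of the library's API.

```python
# Workaround sketch: retry a write after refreshing the expired edit token.
# `write` and `refresh_token` are hypothetical placeholders; in practice they
# would wrap the wikibaseintegrator write call and a fresh-token request.

def retry_on_bad_token(write, refresh_token, max_retries=3):
    """Call write(); on an 'Invalid CSRF token' error, refresh the token
    and try again, up to max_retries additional attempts."""
    for attempt in range(max_retries + 1):
        try:
            return write()
        except Exception as err:
            # Re-raise anything that is not a token error, or a final failure.
            if 'Invalid CSRF token' not in str(err) or attempt == max_retries:
                raise
            refresh_token()
```

This does not address the underlying cause, but it may let an import session survive token expiry.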

Then, when I run a SPARQL query involving the type of data item that was being imported at the time of the crash, the query fails with the error:

Server error: Expected double-quoted property name in JSON at position 696315 (line 27859 column 4)

What should have happened instead?:
The import process should not have generated a CSRF token error, and the SPARQL query should have returned the newly imported items.

Software version (on Special:Version page; skip for WMF-hosted wikis like Wikipedia):
MediaWiki 1.42.1
WikibaseIntegrator 0.12

Other information (browser name/version, screenshots, etc.):
The problem seems related to the size of the data imported, or perhaps to the length of the import session. The container logs suggest that some part of the import pipeline was being overwhelmed.

wdqs-frontend container:

[warn] 9#9: *539 an upstream response is buffered to a temporary file /var/cache/nginx/proxy_temp/2/00/0000000002 while reading upstream, client: 172.18.0.4, server: localhost, request: "POST /proxy/wdqs/bigdata/namespace/wdq/sparql HTTP/1.1", upstream: "http://172.18.0.10:80/bigdata/namespace/wdq/sparql", host: "wdqs-frontend.main.akkadi

After the SPARQL query is run, the wdqs container logs:

java.util.concurrent.ExecutionException: java.util.concurrent.ExecutionException: org.openrdf.query.QueryEvaluationException: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.Exception: task=ChunkTask{query=7cf3142a-32c7-4bea-8171-bc3a2a6b2271,bopId=4,partitionId=-1,sinkId=5,altSinkId=null}, cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.OutOfMemoryError: Direct buffer memory

My import scripts do not cause any problems when uploading to an identically structured Wikibase on wikibase-cloud.