Export JSON Report


Export JSON report

To write scan results to a JSON file, pass -e json followed by the output file name:

pyraider go -e json result.json
pyraider check -f requirements.txt -e json result.json

Export JSON report by severity

Use the -s flag to filter the report by severity. Supported severities:

  1. high
  2. medium
  3. low

pyraider go -e json result.json -s high
pyraider check -f requirements.txt -e json result.json -s high

You should get a result file like this:

{
  "pyraider": [
    {
      "django": {
        "current_version": "1.11.13",
        "update_to": "3.0.5",
        "cwe": "89",
        "cve": "CVE-2020-9402",
        "severity": "HIGH",
        "description": "Django 1.11 before 1.11.29, 2.2 before 2.2.11, and 3.0 before 3.0.4 allows SQL Injection if untrusted data is used as a tolerance parameter in GIS functions and aggregates on Oracle. By passing a suitably crafted tolerance to GIS functions and aggregates on Oracle, it was possible to break escaping and inject malicious SQL."
      }
    },
    {
      "urllib3": {
        "current_version": "1.25.8",
        "update_to": "1.25.8",
        "cwe": "400",
        "cve": "CVE-2020-7212",
        "severity": "HIGH",
        "description": "The _encode_invalid_chars function in util/url.py in the urllib3 library 1.25.2 through 1.25.7 for Python allows a denial of service (CPU consumption) because of an inefficient algorithm. The percent_encodings array contains all matches of percent encodings. It is not deduplicated. For a URL of length N, the size of percent_encodings may be up to O(N). The next step (normalize existing percent-encoded bytes) also takes up to O(N) for each step, so the total time is O(N^2). If percent_encodings were deduplicated, the time to compute _encode_invalid_chars would be O(kN), where k is at most 484 ((10+6*2)^2)."
      }
    }
  ],
  "version": "1.0.3"
}
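
Because the report is plain JSON, it is easy to consume in scripts. Below is a minimal Python sketch, assuming a result.json with exactly the structure shown above; the script is our illustration, not part of PyRaider itself.

import json

# Minimal sketch, assuming result.json matches the structure shown above.
with open("result.json") as fh:
    report = json.load(fh)

# "pyraider" holds a list of single-key objects: {package_name: details}.
for entry in report.get("pyraider", []):
    for package, details in entry.items():
        print(
            f'{package}: {details["current_version"]} -> {details["update_to"]} '
            f'[{details["severity"]}] {details["cve"]} (CWE-{details["cwe"]})'
        )

For the sample report above, this prints one line per finding, for example: django: 1.11.13 -> 3.0.5 [HIGH] CVE-2020-9402 (CWE-89).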