Dotfiles, utilities, and other apparatus.

#!/bin/bash
# I don't know how to jq.
#
# The idea here is a rough guess at patches merged since start of the Wikimedia
# Hackathon 2024.
#
# References:
#
# - https://gerrit.wikimedia.org/r/q/status:merged+mergedafter:%222024-05-03+00:00:00+%252B0300%22,50
# - https://stackoverflow.com/questions/44497533/jq-transform-unix-timestamp-to-datetime
# - Pretty sure date results here will be UTC
# Get a list of patches since midnight May 3rd in Tallinn. This doesn't
# account for pagination, so check that if you run again. I think I used
# gerrit query here instead of curling the API so as not to have to
# cross-reference numeric user IDs.
ssh -p 29418 gerrit.wikimedia.org gerrit query --format json --submit-records status:merged 'mergedafter:{2024-05-03 00:00:00 +0300}' > results.json
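# Not in the original script: the trailing line of results.json is a stats
# object, and I believe (an assumption - verify against your gerrit version)
# it carries a moreChanges field. If that's true, the query hit the result
# limit and needs resuming. Quick check against a sample stats line:
echo '{"type":"stats","rowCount":50,"moreChanges":true}' | jq '.moreChanges'
# For the real output that would be: tail -n 1 results.json | jq '.moreChanges'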
# Trim trailing line with summary:
sed -i '$ d' results.json
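# For illustration (not in the original script), the '$ d' address drops
# only the final line of its input:
printf 'a\nb\nstats\n' | sed '$ d'
# a
# b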
# Dates are unix timestamps - get human readable ones:
jq '.createdOn |= todateiso8601 | .lastUpdated |= todateiso8601' results.json > results_filtered.json
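# Quick sanity check on the conversion (not in the original script):
# 1714683600 is 2024-05-03 00:00:00 +0300 as a unix timestamp, so it
# should come back as the same instant in UTC:
jq -n '1714683600 | todateiso8601'
# "2024-05-02T21:00:00Z"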
# Pull out relevant fields. Assumptions include that the submitter will be
# the first entry in submitRecords, and that labels[1] is the right label to
# read the submitter from - I honestly have no idea whether either holds in
# general.
#
# There's also an @tsv available in jq, but without the -r (raw output) flag
# it produces a quoted JSON string per row with \t escapes instead of an
# actual TSV, which is... not actually TSV. The same applies to @csv, so
# pass -r here.
jq -r '[
.project,
.url,
.subject,
.owner.name,
.owner.email,
.submitRecords[0].labels[1].by.name,
.submitRecords[0].labels[1].by.email,
.createdOn,
.lastUpdated
] | @csv' \
results_filtered.json > results.csv
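# Not in the original script: results.csv ends up without a header row. One
# way to generate a matching one with the same @csv quoting (column labels
# are my own invention, following the field order above):
jq -rn '["project","url","subject","owner_name","owner_email","submitter_name","submitter_email","created_on","last_updated"] | @csv'
# Prepend it with something like:
#   { jq -rn '[...] | @csv' ; cat results.csv ; } > results_with_header.csv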