Extraction of multiple occurrences of variable data from a large string

I have a very long string in a text file.
It is basically the below string repeated around 1000 times (as one long string, not 1000 strings).
The string contains values which change with each repetition (the Random Bank, Random Rate, and Random Product placeholders in the sample below).
I'd like to extract these values in an automated way and write the output to either a CSV or a formatted txt file (Random Bank, Random Rate, Random Product).
I can do this successfully using https://regex101.com, but it involves a lot of manual copy and paste.
I'd like to write a bash script to automate extracting the information, but have had no luck with various grep commands.
How can this be done? (I'd also consider doing it in Java.)

[{"AccountName":"Random Product","AccountType":"Variable","AccountTypeId":1,"AER":Random Rate,"CanManageByMobileApp":false,"CanManageByPost":true,"CanManageByTelephone":true,"CanManageInBranch":false,"CanManageOnline":true,"CanOpenByMobileApp":false,"CanOpenByPost":false,"CanOpenByTelephone":false,"CanOpenInBranch":false,"CanOpenOnline":true,"Company":"Random Bank","Id":"S9701Monthly","InterestPaidFrequency":"Monthly"


This is JSON-formatted data, which you can't reliably parse with regular expressions. Get a JSON parser. If the file is larger than, say, 1 GB, find one that lets you stream, i.e. process the data as it is parsed, rather than the more usual route of turning the entire input into a single object; if the file is huge, that object would be huge too and you might run out of memory, hence the need for streaming.

Here is one tutorial for Jackson streaming.
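
For the Java route, here is a minimal sketch of that streaming approach using Jackson's streaming API (JsonFactory/JsonParser). It assumes the real file is one valid JSON array of flat objects like the snippet above and that you only want Company, AER, and AccountName; the file names accounts.json and accounts.csv are placeholders, and it does no CSV quoting, so adjust to your data:

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

import java.io.File;
import java.io.PrintWriter;

public class ExtractAccounts {
    public static void main(String[] args) throws Exception {
        JsonFactory factory = new JsonFactory();
        try (JsonParser parser = factory.createParser(new File("accounts.json")); // placeholder input file
             PrintWriter out = new PrintWriter("accounts.csv")) {                 // placeholder output file

            out.println("Company,AER,AccountName"); // CSV header

            String company = null, aer = null, accountName = null;
            int depth = 0; // object nesting level; 1 = inside one account object
            JsonToken token;

            while ((token = parser.nextToken()) != null) {
                if (token == JsonToken.START_OBJECT) {
                    depth++;
                } else if (token == JsonToken.END_OBJECT) {
                    depth--;
                    if (depth == 0) {
                        // Finished one account object: emit one CSV row.
                        // No CSV quoting, so values containing commas would need extra handling.
                        out.println(company + "," + aer + "," + accountName);
                        company = aer = accountName = null;
                    }
                } else if (token == JsonToken.FIELD_NAME && depth == 1) {
                    String field = parser.getCurrentName();
                    if (field.equals("Company") || field.equals("AER") || field.equals("AccountName")) {
                        parser.nextToken();              // advance to the field's value
                        String value = parser.getText(); // textual form works for strings and numbers alike
                        if (field.equals("Company"))      company = value;
                        else if (field.equals("AER"))     aer = value;
                        else                              accountName = value;
                    }
                }
            }
        }
    }
}

Pulling the values straight off the token stream like this keeps memory usage flat no matter how many repetitions the file contains, which is the point of the streaming API over reading the whole input into a tree or POJOs first.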