I have a file that contains this JSON array:
[
{'sector':'1','info':[{'id':'1234','pos':'55039974','risk':'low'},{'id':'3333','pos':'44530354','risk':'middle'}]},
{'sector':'2','info':[{'id':'2434','pos':'45455554','risk':'high'},{'id':'4444','pos':'4454555','risk':'high'}]}
]
as a single line
[{'sector':'1','info':[{'id':'1234','pos':'55039974','risk':'low'},{'id':'3333','pos':'44530354','risk':'middle'}]},{'sector':'2','info':[{'id':'2434','pos':'45455554','risk':'high'},{'id':'4444','pos':'4454555','risk':'high'}]}]
as line 2 of the file. How can I read just line 2 (the JSON array) out of that file and print it with
print(str(lines[0]['sector'][0]['id']))
? I don't want to use the eval function, because with a huge array of around 1e6 entries eval increases my RAM usage up to 6 GB. That's why I'm trying to figure out how other functions can read the string and convert it to an array. Thanks for any help.
Update: I tried the following; each version runs inside `with open('file.txt') as f:`.

v1:

    t = f.readlines()[2]
    s = json.loads(t)
    print(s[0]['sector'])

v1 output:

    Traceback (most recent call last):
        s = json.loads(t)
      File "/usr/lib/python3.7/json/__init__.py", line 348, in loads
        return _default_decoder.decode(s)
      ...
      ...
    json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 3 (char 2)

v2:

    t = f.readlines()[2]
    x = json.dumps(t, indent=4)
    #x = json.dumps(t)
    s = json.loads(x)
    print(s[0]['sector'])

v2 output:

    TypeError: string indices must be integers

v3:

    t = f.readlines()[2]
    x = json.dumps(t, indent=4)
    s = json.JSONDecoder().decode(json.JSONEncoder().encode(x))
    #s = json.JSONDecoder().decode(x)
    #s = json.JSONEncoder().encode(x)
    print(s[0]['sector'])

v3 output:

    TypeError: string indices must be integers
Update 2: The array is the problem! This constellation, without the [ and without sector 2, works:
{"sector":"1","info":[{"id":"1234","pos":"55039974","risk":"low"},{"id":"RS11591147x","pos":"44530354","risk":"middle"}]}
Update 3 (closed): The array is OK, but I had to replace each ' with " in the array. The memory only increases from 450 MB to 700 MB, which is huge compared to 6 GB. The downside is that the txt files grow from 30 MB to 50 MB, but who cares? xD
    import json

    whereIsMyArray = 1
    with open('file.txt', 'r') as fp:
        line = fp.readlines()[whereIsMyArray].replace("'", '"')
        data = json.loads(line)
        print(data[1]['info'][1]["risk"])
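A caveat on the `.replace("'", '"')` workaround: it corrupts the data if any value ever contains an apostrophe. Since the file actually holds a Python-literal array rather than JSON, one stdlib alternative worth sketching is `ast.literal_eval`, which parses literals (dicts, lists, strings, numbers) without the arbitrary-code-execution risk of `eval` — though it still builds the whole structure in memory, so whether it beats the replace-then-`json.loads` approach for 1e6 entries is worth measuring. The sample line below is a shortened, hypothetical version of the file's line 2:

```python
import ast

# Single-quoted line as stored in the file (shortened sample).
line = "[{'sector':'1','info':[{'id':'1234','pos':'55039974','risk':'low'}]}]"

# literal_eval accepts only literals, so it parses the single-quoted
# data directly -- no quote replacement needed.
data = ast.literal_eval(line)
print(data[0]['info'][0]['risk'])  # prints low
```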
Parse it with the json standard library module, just as you would any other JSON data. "Because if I use a huge array like 1e6 entries eval increases my RAM up to 6 GB." There is no way to avoid the memory usage if you have to keep 1e6 parsed JSON fragments in memory at a time. They take up as much space as they do; your file is as big as it is.
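One further saving, independent of the parser: `f.readlines()` materializes every line of the file in a list just to pick one. A sketch using `itertools.islice` to read only the wanted line — the function name is made up, and the quote replacement mirrors the workaround above:

```python
import json
from itertools import islice

def load_array_line(path, line_index):
    """Read only the line at line_index (0-based) and parse it as JSON,
    without loading the rest of the file into memory."""
    with open(path) as fp:
        # islice advances the file iterator line by line; only the
        # target line is ever held as a string.
        line = next(islice(fp, line_index, line_index + 1))
    return json.loads(line.replace("'", '"'))
```

For the memory numbers quoted in Update 3 this mostly matters when the file holds many large lines; for a two-line file the difference is negligible.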