I have a file that contains this JSON array

[
  {'sector':'1','info':[{'id':'1234','pos':'55039974','risk':'low'},{'id':'3333','pos':'44530354','risk':'middle'}]},
  {'sector':'2','info':[{'id':'2434','pos':'45455554','risk':'high'},{'id':'4444','pos':'4454555','risk':'high'}]}
]

stored as a single line

[{'sector':'1','info':[{'id':'1234','pos':'55039974','risk':'low'},{'id':'3333','pos':'44530354','risk':'middle'}]},{'sector':'2','info':[{'id':'2434','pos':'45455554','risk':'high'},{'id':'4444','pos':'4454555','risk':'high'}]}]

on line 2 of the file. How can I read just line 2, which is that JSON array, out of the file and print it with

print(str(lines[0]['sector'][0]['id']))

? I don't want to use the eval function, because with a huge array of around 1e6 entries eval pushes my RAM usage up to 6 GB. That's why I'm trying to figure out how to read the string and convert it to an array with other functions. Thanks for any help.
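For reference, here is a minimal sketch of that approach (the file name, the line position, and the quote replacement are assumptions based on the sample above; the updates below arrive at essentially the same thing):

import json
from itertools import islice

with open('file.txt') as f:
    line = next(islice(f, 1, 2))  # line 2 has index 1; islice avoids readlines() keeping every line

# json.loads only accepts double-quoted strings, so the single quotes
# in the sample data are swapped first (see the updates below)
data = json.loads(line.replace("'", '"'))
print(data[0]['info'][0]['id'])  # note: 'id' lives under 'info', not 'sector'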

Update: I tried:

with open('file.txt') as f:
  """ 
  v1: 
  t = f.readlines()[2]
  s = json.loads(t)
  print(s[0]['sector'])
  v1 output: 
  Traceback (most recent call last):
    ...
      s = json.loads(t)
    File "/usr/lib/python3.7/json/__init__.py", line 348, in loads
      return _default_decoder.decode(s)
    ...
  json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 3 (char 2)

  v2:
  t = f.readlines()[2]
  x = json.dumps(t, indent=4)
  #x = json.dumps(t)
  s = json.loads(x)
  print(s[0]['sector'])
  v2 output:
  TypeError: string indices must be integers

  v3:
  t = f.readlines()[2]
  x = json.dumps(t, indent=4)
  s = json.JSONDecoder().decode(json.JSONEncoder().encode(x))
  #s = json.JSONDecoder().decode(x)
  #s = json.JSONEncoder().encode(x)
  print(s[0]['sector'])
  v3 output:
  TypeError: string indices must be integers
  """

Update 2: The array is the problem! This constellation, without the [ and without sector 2, works:

{"sector":"1","info":[{"id":"1234","pos":"55039974","risk":"low"},{"id":"RS11591147x","pos":"44530354","risk":"middle"}]}

Update 3 (closed): The array is fine, but I had to replace the ' with " in each array. Memory only goes up from 450 MB to 700 MB, which is a huge improvement compared to 6 GB. The downside is that the txt files grow from 30 MB to 50 MB, but who cares? xD

import json

whereIsMyArray = 1  # the array sits on line 2, i.e. index 1
with open('file.txt', 'r') as fp:
    line = fp.readlines()[whereIsMyArray].replace("'", '"')
    data = json.loads(line)
print(data[1]['info'][1]['risk'])
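An alternative worth mentioning: since the data on disk uses single quotes (Python-literal style rather than JSON), ast.literal_eval can parse the line directly without the quote replacement. Unlike eval it only evaluates literals, but for arrays in the 1e6-entry range it is not necessarily lighter on memory than json.loads, so treat this as a sketch rather than a recommendation:

import ast

with open('file.txt', 'r') as fp:
    line = fp.readlines()[1]       # line 2 holds the array
    data = ast.literal_eval(line)  # accepts the single-quoted form as-is
print(data[1]['info'][1]['risk'])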
  • "how can i read just line 2" The same way that you would ordinarily read a line from file. "i dont want to use eval function" And you shouldn't; you should parse the resulting string with the json standard library module, just as you would any other json data. "because if i use a huge array like 1E6 entries eval increase my RAM up to 6Gb." There is no way to avoid the memory usage, if you have to keep 1e6 parsed JSON fragments in memory at a time. They take up as much space as they do; your file is as big as it is. Commented Nov 30, 2020 at 23:58
  • It's not clear what the difficulty is, since you don't show the code for your existing approach, except the part using the parsed JSON result. "How do I read a line from a file as a string" and "how do I parse a string as json" are two separate questions, and each is better answered by searching or by going directly to the documentation. Commented Dec 1, 2020 at 0:01
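On the memory point raised in the first comment: if the array could be stored as valid (double-quoted) JSON in its own file, a streaming parser such as the third-party ijson package would let you process it one element at a time instead of keeping all ~1e6 parsed entries in memory at once. A rough sketch, with the file name and layout assumed:

import ijson

# assumes array.json contains only the JSON array, with double-quoted keys/values
with open('array.json', 'rb') as f:
    for sector in ijson.items(f, 'item'):  # yields one top-level array element at a time
        for entry in sector['info']:
            if entry['risk'] == 'high':
                print(entry['id'], entry['pos'])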

1 Answer


You could try the json library's json.loads function:

import json

with open('yourfile.txt') as f:
    s = f.readlines()[1]  # line 2 of the file has index 1

result = json.loads(s)
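Note that json.loads only accepts double-quoted strings, so with the single-quoted data shown in the question the quotes have to be replaced first (as the asker ends up doing in Update 3). Once the parse succeeds, the result is an ordinary list of dicts and can be indexed as such, for example:

print(result[0]['sector'])          # '1'
print(result[0]['info'][0]['id'])   # '1234'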

1 Comment

@karl-knechtel and IoaTzimas, see the update in my question. I tried 3 different versions but none of them works. Any idea what I'm doing wrong?
