JSON File Error

Hey @Lucia_Cisneros, I’m afraid your save has been corrupted, which is why it didn’t appear in the list. I’m not sure why this happened; I’ll let the rest of the team know. Basically, the metadata.json file got corrupted, and if I re-add a metadata.json from another save, I get this in stonehearth.log:

2016-09-22 17:25:26.548206 | server | 0 | simulation.core | Failed to load: could not parse object 406020 of 410419 (size:112).
2016-09-22 17:25:26.548206 | client | 0 | client.core | server load save file failed. returning to stonehearth title screen.
2016-09-22 17:25:26.548206 | client | 1 | mod radiant | lua controller lifetime tracking set to "nil"
2016-09-22 17:25:26.728206 | server | 0 | lua.code | lua panic. forcing application exit
2016-09-22 17:25:26.728206 | server | 2 | lua.code | generating traceback...
2016-09-22 17:25:26.728206 | server | 0 | lua.code | -- Script Error (lua) Begin -------------------------------
2016-09-22 17:25:26.728206 | server | 0 | lua.code | events
2016-09-22 17:25:26.728206 | server | 0 | lua.code | stack traceback:
2016-09-22 17:25:26.728206 | server | 0 | lua.code | -- Lua Error End -------------------------------
2016-09-22 17:25:26.728206 | server | 0 | lua.memory | destroying lua caching allocator
2016-09-22 17:25:26.728206 | client | 0 | sysinfo | Memory Stats: Fatal Exception
2016-09-22 17:25:26.728206 | client | 0 | sysinfo | Total System Memory: 15.876 GB (17046401024 bytes)
2016-09-22 17:25:26.728206 | client | 0 | sysinfo | Current Memory Usage: 411.855 MB (431861760 bytes)
2016-09-22 17:25:26.728206 | client | 0 | sysinfo | Total Address Space: 8.000 TB (8796092891136 bytes)
2016-09-22 17:25:26.728206 | client | 0 | sysinfo | Available Address Space: 7.999 TB (8794711007232 bytes)
2016-09-22 17:25:26.728206 | client | 0 | sysinfo | Used Address Space: 1.287 GB (1381883904 bytes)
2016-09-22 17:25:41.732244 | server | 0 | app | Assertion Failed: callbacks_.size() == 0(c:\rb\ihome\root\sh-ob0-build\stonehearth\source\core\slot.h:28)
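
In the meantime, if you want to check which of your saves still have a readable metadata.json, a quick script like the one below should do it. This is just a rough sketch: the saved_games path is an assumption based on a default Steam install, so point it at wherever your Stonehearth saves actually live.

```python
# Rough sketch: report whether each save folder's metadata.json still parses as JSON.
# ASSUMPTION: SAVE_DIR below is a default Steam install path; adjust it for your setup.
import json
from pathlib import Path

SAVE_DIR = Path(r"C:\Program Files (x86)\Steam\steamapps\common\Stonehearth\saved_games")

def check_metadata(save_folder: Path) -> None:
    """Print whether metadata.json in a single save folder is readable JSON."""
    metadata = save_folder / "metadata.json"
    if not metadata.is_file():
        print(f"{save_folder.name}: no metadata.json found")
        return
    try:
        json.loads(metadata.read_text(encoding="utf-8"))
        print(f"{save_folder.name}: metadata.json parses fine")
    except (UnicodeDecodeError, json.JSONDecodeError) as err:
        print(f"{save_folder.name}: metadata.json looks corrupted ({err})")

if __name__ == "__main__":
    for folder in sorted(SAVE_DIR.iterdir()):
        if folder.is_dir():
            check_metadata(folder)
```

Any save it flags as corrupted is likely the one missing from the load list.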

We’ll keep looking into it, but I’m afraid there is no workaround at this time. :frowning:
