Jon00 DataScraper/JSON Parser Script For Homeseer 3 and Homeseer 4


    Thanks for the update.

    As I said on the other thread, there is no need to use a script and 2 devices as an intermediary to parse the JSONP into JSON.

    Pattern0 can be used to extract part of the downloaded string (which Pattern1 etc. then work on); in the case below it pulls the correct JSON string out of the JSONP string.

    So all you need is the following:

    Code:
    [Grab5]
    Path=https://www.dwd.de/DWD/warnungen/warnapp/json/warnings.json
    TextFile=0
    DeviceMode=0
    SSLMode=2
    Pattern0=(?s)\((.*?)\);
    Pattern1=JSON>warnings.106535000.0.description
    DeviceName1=DWD3
    DeviceText1=[0]
    DeviceValue1=
    DeviceImage1=
    All done in one hit!
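    For anyone curious what Pattern0 is doing there: the DWD feed is JSONP, i.e. the JSON is wrapped in a JavaScript callback, and the regex simply captures everything between the opening bracket and the trailing ");". A minimal Python sketch, purely for illustration (the callback name and payload are made-up examples, not the real feed content):

    Code:
    import re

    # Illustration only: what the Pattern0 regex does to a JSONP payload.
    jsonp = 'warnWetter.loadWarnings({"time":1700000000,"warnings":{}});'

    # Same expression as Pattern0: capture everything between "(" and the trailing ");"
    match = re.search(r'(?s)\((.*?)\);', jsonp)
    if match:
        json_text = match.group(1)
        print(json_text)   # -> {"time":1700000000,"warnings":{}}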

    If you want to get rid of the icon next to the text, just set DeviceMode=2
    Jon

    Comment


      Thanks again Jon,
      that would have saved me a lot of time 😉 but I was glad to have done something on my own in HomeSeer!
      Your work is probably more elegant!

      Fischi

      Comment


        Hello Jon,
        I've just tested it and it runs perfectly, and much faster than mine!
        I've seen that the parsed JSON content shows garbled characters instead of m² (in English that means square meter). Is there a setting in the .ini or somewhere else to display it correctly, as in the JSONP?

        What is the Jon00DataScraperJSON script for? I'm only using the Jon00DataScraper script and it works fine.

        Fischi

        Comment


          Looks like a character encoding issue. See if this resolves it.

          Add the following under [Grab5] entries:

          Code:
          ContentType=application/json
          Encoding=UTF-8
          Next, you need to convert the Jon00DataScraperData.ini file to UTF-8 encoding.

          Using Notepad++, open Jon00DataScraperData.ini in that editor.

          Select Encoding from the top menu, choose Convert to UTF-8, then save the file and close the editor.

          Finally, run the DataScraper script to update the data on the virtual device. Hopefully it will now display correctly.
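          For anyone wondering why the superscript came out garbled in the first place: this is classic mojibake, where UTF-8 bytes from the server end up decoded or stored as Latin-1. A minimal Python sketch, purely to illustrate the effect:

          Code:
          # Illustration only: UTF-8 bytes decoded with the wrong character set.
          raw = "m²".encode("utf-8")      # the server sends UTF-8 bytes: b'm\xc2\xb2'

          print(raw.decode("latin-1"))    # wrong decoding  -> 'mÂ²' (garbled)
          print(raw.decode("utf-8"))      # correct decoding -> 'm²'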

          As for the Jon00DataScraperJSON script, it is called from the main script to handle the JSON decoding. The code was separated from the main script to avoid breaking everyone's setup when they upgraded to the new version with the JSON decoding capability, because the JSON parsing requires an additional entry in the ScriptingReferences setting in settings.ini.
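          For reference, that entry lives in the [Settings] section of HomeSeer's settings.ini. The exact value required is documented with the script; the usual form for .NET JSON support is along these lines (treat this as an assumption, not the definitive entry):

          Code:
          [Settings]
          ScriptingReferences=System.Web.Extensions;System.Web.Extensions.dll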



          Jon

          Comment


            V1.0.35 is now available.

            This version adds a new device mode for HS4 users.
            Jon

            Comment


              Just a quick question.
              Does the scraper support basic auth?
              I really have no clue about scraping, but I will try to learn.
              To get there, though, I first need to be able to log in.

              I will try to use the TIGO JSON API for solar optimizers.

              From their API guide:
              The curl example works, but how do I use the token in your scraper?

              [Attached screenshot (image.png): the curl example from the TIGO API guide]

              Comment


                Looks like I never added headers for 'GET' (required for the Authorization Bearer) but can easily add that.
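                For reference, once GET headers are supported, a curl call of the form curl -H "Authorization: Bearer <token>" <url> should translate into extra entries under the relevant [GrabX] section, along these lines (the token is a placeholder; the entry names follow the examples later in this thread):

                Code:
                ContentType=application/json
                AcceptHeader=application/json
                Header1=Authorization, Bearer <your API token>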

                Happy to work via PM if you want.
                Jon

                Comment


                  Originally posted by jon00:
                  Looks like I never added headers for 'GET' (required for the Authorization Bearer) but can easily add that.

                  Happy to work via PM if you want.
                  Thanks for helping with this last time. It still works like a charm.
                  Now my next problem is getting production data from all panels.

                  The API has aggregated values for each panel and I get those values just fine, but I need to get just the last minute with each query.
                  Can this be done without a helper script?

                  This is what the query looks like:

                  [Attached screenshot (image.png): the query]

                  And it returns:

                  [Attached screenshot (image.png): the returned per-panel data]

                  And so on for all panels.

                  Comment


                    You would have to post what Datascraper has downloaded in the Jon00Datascraperdata.ini file and highlight what you want to extract.
                    Jon

                    Comment


                      Originally posted by jon00:
                      You would have to post what Datascraper has downloaded in the Jon00Datascraperdata.ini file and highlight what you want to extract.
                      Hi.

                      The scraping part is ok. I think I will get that working.

                      The issue is that I need the start and end dates to be "now minus 1 minute", and the date/time has to change each time I query.
                      Maybe I'm explaining it badly, but the goal is always to get the last minute of production.

                      [Attached screenshot (image.png): the query with start/end timestamps]

                      Comment


                        Apologies, but I deal with so many people that I cannot remember the full details of your case. I remember adding 'GET' headers for you, but that was about it.

                        So is this the URL you use in the Path= entry under [GrabX] in the Jon00DataScraper.ini file? If so, you have added the two time periods to the URL query to obtain the last minute's data, but you want the two time periods changed automatically each time the script runs, so it always gets the last minute?
                        Jon

                        Comment


                          Originally posted by jon00:
                          Apologies, but I deal with so many people that I cannot remember the full details of your case. I remember adding 'GET' headers for you, but that was about it.

                          So is this the URL you use in the Path= entry under [GrabX] in the Jon00DataScraper.ini file? If so, you have added the two time periods to the URL query to obtain the last minute's data, but you want the two time periods changed automatically each time the script runs, so it always gets the last minute?
                          Hi.

                          Sorry for not being more clear, but yes, you are correct.

                          This is probably not entirely correct, but it is close.
                          The only thing missing is how to change the time period automatically.

                          Code:
                          [Grab2]
                          Path = https://api2.tigoenergy.com/api/v3/data/aggregate?system_id=75923&level=min&end=2023-06-21T10:25:00&start=2023-06-21T10:25:00
                          TextFile=1
                          Encoding=
                          Username=
                          Password=
                          Options=
                          UserAgent=
                          Devicemode=4
                          StripHTML=1
                          UseIE=0
                          SSLMode=0
                          GroupDevices=1
                          PostData=
                          ContentType=application/json
                          AcceptHeader=application/json
                          Header1=Authorization, Bearer XXXXXXX
                          Header2=
                          Header3=
                          
                          Pattern1=JSON>85158127
                          Pattern2=JSON>85158128
                          Pattern3=JSON>85158129
                          Pattern4=JSON>85158130
                          Pattern5=JSON>85158131
                          
                          DeviceName1=A1
                          DeviceText1=[500]
                          DeviceValue1=[500] WH
                          DeviceImage1=
                          Speakbutton1=0
                          TriggerString1=
                          SearchMode1=1
                           TriggerEvent1=

                          Comment


                            OK, but it would be both the date and time that would need to be calculated (Now minus 1 minute) so it would work over midnight?

                            It's not possible now but I can add a tag to do that.
                            Jon

                            Comment


                              Originally posted by jon00:
                              OK, but it would be both the date and time that would need to be calculated (Now minus 1 minute) so it would work over midnight?

                              It's not possible now but I can add a tag to do that.
                              I was afraid of that.
                              Great if this will be implemented in a future version.
                              Thanks again for excellent support!

                              Comment


                                I'll send you a new version to try (via PM) shortly.

                                I've added a new tag: [iso8601time], which will give the current time in ISO 8601 format.

                                You can also add or subtract minutes using:

                                [iso8601time+5] (add 5 minutes)
                                [iso8601time-1] (subtract 1 minute)

                                Use these in your Path:

                                Path = https://api2.tigoenergy.com/api/v3/d...level=min&end=[iso8601time-1]&start=[iso8601time-1]
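                                For anyone wanting to see what the tag works out to, here is a minimal Python sketch of the substitution (illustration only; it assumes the API accepts local time with no timezone suffix, matching the format in the earlier Path example, and it naturally handles the roll-over past midnight):

                                Code:
                                from datetime import datetime, timedelta

                                # Illustration only: what a tag like [iso8601time-1] expands to.
                                def iso8601time(offset_minutes=0):
                                    t = datetime.now() + timedelta(minutes=offset_minutes)
                                    return t.strftime("%Y-%m-%dT%H:%M:%S")   # e.g. 2023-06-21T10:24:00

                                ts = iso8601time(-1)   # "now minus one minute"
                                url = ("https://api2.tigoenergy.com/api/v3/data/aggregate"
                                       f"?system_id=75923&level=min&end={ts}&start={ts}")
                                print(url)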
                                Jon

                                Comment
