    Help with hs.GetURL

    I have this code (MARK 1):

    <pre class="ip-ubbcode-code-pre">
    sub main()
        sRawWeb = hs.GetURL("www.tvnz.co.nz","",False,80)

        'Find the first table after the Vignette marker
        iPos1 = InStr(10,sRawWeb,"<!-- Vignette")
        iPos1 = InStr(iPos1+1,sRawWeb,"<table")
        iPos2 = InStr(iPos1+1,sRawWeb,"</table")

        'Grab the first table and push it to device z1
        temp = Mid(sRawWeb,iPos1,iPos2-iPos1+8)
        temp = Replace(temp,"What's On","What's On ONE")
        hs.SetDeviceString "z1",temp
        hs.SetDeviceLastChange "z1",Now()

        'Grab the second table
        iPos1 = InStr(iPos2+1,sRawWeb,"<table")
        iPos2 = InStr(iPos1+1,sRawWeb,"</table")
        temp = Mid(sRawWeb,iPos1,iPos2-iPos1+8)
        temp = Replace(temp,"What's On","What's On TWO")
    end sub
    </pre>

    The problem is: when I run it from an event, it shows the data from the last time I opened the page in Internet Explorer manually.

    e.g. if I run it as is, it shows me the data from the last time Explorer opened the page;
    if I go to the page with Explorer first and then run the event again, the data is right.

    Why? What am I doing wrong?

    StePhan McKillen

    My Pages


    #2
    There is probably a proxy/cache server telling HomeSeer that it already has the latest copy of the page. The reason it works after you surf to it is that your browser is probably configured to always get the latest version (e.g. check the date/time on each page visit). In Internet Exploder (or Control Panel > Internet), go to Tools > Internet Options, then on the General tab click Settings in the Temporary Internet files section. Set "Check for newer versions of stored pages" to "Every visit to the page" and see if this helps.
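    If the fetch is going through that same cache, another way to sidestep a stale copy is to make every request look unique. This is only a sketch, not HomeSeer documentation: it assumes the page argument of hs.GetURL is sent as-is as the request path, and the "nocache" parameter name is made up (any value that changes per run would do).

    <pre class="ip-ubbcode-code-pre">
    sub main()
        'Append a throwaway query string so any proxy/cache between
        'HomeSeer and the site treats this as a brand new page.
        'ASSUMPTION: hs.GetURL sends the page argument unchanged and
        'the server ignores the unknown "nocache" parameter.
        sPage = "/?nocache=" & Timer
        sRawWeb = hs.GetURL("www.tvnz.co.nz", sPage, False, 80)
        hs.WriteLog "GetURL test", "Fetched " & Len(sRawWeb) & " characters"
    end sub
    </pre>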


    Regards,

    Rick Tinker (a.k.a. "Tink")
    HomeSeer Technologies



      #3
      I've recently tried the two Weatherbug-based weather packages, which both use hs.GetURL. In my case I was able to retrieve the "nearest stations" page, but all requests to actually get data from that station returned an empty page. If you continue to have problems, you can use the GetWebPage.exe application to replace hs.GetURL in your script.
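      The command-line syntax for GetWebPage.exe isn't shown in this thread, so purely as an illustration of swapping out hs.GetURL, here is a sketch that fetches the page with MSXML2.ServerXMLHTTP from the same VBScript. ServerXMLHTTP goes through WinHTTP rather than the WinInet/IE cache, so it always asks the server for a fresh copy. The object and its methods are standard Windows scripting, not part of HomeSeer's API, and the URL is just the one from the original script.

      <pre class="ip-ubbcode-code-pre">
      sub main()
          'Fetch the page without hs.GetURL (and without the IE cache)
          Dim oHttp, sRawWeb
          Set oHttp = CreateObject("MSXML2.ServerXMLHTTP")
          oHttp.Open "GET", "http://www.tvnz.co.nz/", False
          oHttp.Send
          sRawWeb = oHttp.responseText
          Set oHttp = Nothing
          'sRawWeb can now be parsed with InStr/Mid exactly as before
      end sub
      </pre>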



        #4
        Michael,

        I have GetWebFile.exe (not GetWebPage.exe) in my HS root.
        Same program?
        What is the syntax for its usage?

        regards,

        GenevaDude



          #5
          That fixed it.

          StePhan McKillen

          My Pages
