Greyhound Mystique

MemphisFlash
Posts: 2335
Joined: Fri May 16, 2014 10:12 pm

Thanks, it's been a long project but I think they are done now.
They look good, work great, and I've learnt a lot of advanced techniques in Excel thanks to some fellow members on here.
murdok
Posts: 151
Joined: Sun Apr 02, 2017 7:10 pm

Worst day ever at the greyhounds, the market was always against me lol :? :twisted: :twisted:
ArticalBadboy
Posts: 106
Joined: Tue Feb 14, 2017 1:43 pm

Good Morning, hope you're having a nice Christmas

Could someone assist me please?

I'd like to learn how this Python code works so I can adapt it.

Q1. What are k and v in this line of code please: x = sorted(((k, v) for k, v in Runners.items()))
Q2. The code cycles through and compares each selection's average with the other selections' averages, but how does it know to only return the lowest/quickest?
Q3. This leads on to my requirement. How would I adapt the code to show selections if the 6th slowest average is >=0.1 slower than the 5th slowest average?

Edit:
Q4. I've just worked out that the 'Time Difference' figure is the difference in the Last Race and not the difference in the overall average... surely this is not correct?
e.g. First line in today's scrape = 11:06 Nottingham Tue 29 December 2020, 5. Apache Girl, class A4, time difference 0.38

                  Best,  Last,  Average
Apache Girl       31.16, 30.31, 30.74
Cracker Be Slick  30.88, 30.69, 30.79
Difference        -0.28, 0.38,  0.05

This would mean Apache Girl is not a real selection, as the average difference is 0.05, less than the required 0.1.

Q5. I don't think the 'Best' data on the Sporting life racecard page is correct!
Just looking at Apache Girl above, the last run of 30.31 is clearly quicker than the stated Best time of 31.16
When I looked through the form, the dog has run a 30.25
Can the code be adapted to look through the form dropdown for each dog...this would correctly identify the best time AND perhaps only look through races of the same length as the racecard?

Thank you in advance for any help you can give.

Kind Regards
spreadbetting
Posts: 3140
Joined: Sun Jan 31, 2010 8:06 pm

Q1. Just sorting the runners by fastest time. The code is simply grabbing the best time it can and the dog name, no averages, just best time. If there's no best time it simply returns 100, so that would be the worst time. On the first grab they're in trap order; after the sort they're in lowest-to-highest order.

Q2. As per above it's only coded to grab the best time, no averages.

Q3. You'd add more elements to your array, i.e. add the slowest time, sort by that data, then compare the first two dogs.

Q4. Again the code was only ever grabbing the Best time regardless of anything else.

Q5. You could well be right; I never bothered checking the Sporting Life data against other sources, as it was thrown together simply because I was trying to learn Python at the time and it seemed a decent project to kick off with. Realistically, if the data is being sent to a browser it can be scraped; how easy that becomes is down to how it's rendered on the webpage by your browser, but Python has lots of modules coded for scraping, so there's bound to be one that would work. Once you've managed to gather the data, the rest, like omitting different race distances etc., should be very easy as you'd just be doing comparisons and sorting.
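To make the Q3 suggestion concrete, here is a minimal sketch with made-up averages (the real script would build these from the scrape). Note that in a six-dog field the "6th slowest" is simply the fastest, so the check reduces to the gap between the two quickest, which is exactly what comparing the first two sorted dogs does:

```python
# Hypothetical averages (name -> average time); in practice these
# would come from the scraped race data, not be hard-coded.
averages = {
    "Dog A": 30.10, "Dog B": 30.25, "Dog C": 30.40,
    "Dog D": 30.55, "Dog E": 30.90, "Dog F": 31.20,
}

# Sort into (time, name) tuples, fastest first / slowest last.
ranked = sorted((t, name) for name, t in averages.items())

# Gap between the fastest and second-fastest, against the 0.1s threshold.
gap = round(ranked[1][0] - ranked[0][0], 2)
if gap >= 0.1:
    print(f"Selection: {ranked[0][1]}, gap {gap}")
```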
ArticalBadboy
Posts: 106
Joined: Tue Feb 14, 2017 1:43 pm

Thanks for the reply SB

I think I understand... I mistakenly thought 'extract_times' was a function that extracted both Best and Last from gh-racing-runner-greyhound-sub-info.

Thank you
spreadbetting
Posts: 3140
Joined: Sun Jan 31, 2010 8:06 pm

extract_times was just to grab the best time (times_regex); the last bit in the regex was just an end marker of the string so Last could be stripped too. Sometimes there would only be one time, so it'd grab that (best_times_regex), and if there were no times to grab it'd simply return 100.

I can't remember off the top of my head what

Code: Select all

if float(match.group(2)) < float(match.group(1)):
    return float(match.group(2))
else:
    return round((float(match.group(1)) + float(match.group(2))) / 2, 2)
was set to do, but it picked out the best (match.group(1)) and last time (match.group(2)) and returned them as per the calculation: if the last time was better than the best (god knows why Sporting Life didn't report it as best) it'd return the last time; otherwise it added the best and last and divided by two.
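As a standalone sketch of that logic (the function name combine_times and the sample values are illustrative, not from the original script):

```python
def combine_times(best_time, last_time):
    """Mirror the quoted logic: if the last run beat the listed
    'best', trust the last run; otherwise average the two."""
    if last_time < best_time:
        return last_time
    return round((best_time + last_time) / 2, 2)

print(combine_times(31.16, 30.31))  # -> 30.31 (last run beat the listed best)
print(combine_times(30.25, 30.75))  # -> 30.5  (average of the two)
```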

Like I said, it was always more of an exercise in Python than a fully-blown strategy.
Emmson
Posts: 3577
Joined: Mon Feb 29, 2016 6:47 pm

Always fascinated by what effect an outsized lump on a dog market has. That one on the 1st ladder just sat there like a bump on a log; it didn't come and go or get pulled up or down, it was hardly touched, and then it was pulled 22 seconds after post time.

I have come to the conclusion that manually trading the dogs using a market-making method is the easiest way to turn a profit on Betfair. The downside is it doesn't bring in big rewards, it breaks down if you up the stakes, and it fries the brain if you do it for an extended period.
Emmson
Posts: 3577
Joined: Mon Feb 29, 2016 6:47 pm

A late afternoon dog race at Newcastle (about half a mile from where I live) attracting 70K.
Archery1969
Posts: 4478
Joined: Thu Oct 24, 2019 8:25 am

On most races you will notice something happens on the favourite in the final minute of trading.

- A large amount of cash is stuck on one side or the other, forcing the price up or down for x amount of ticks. Normally it's on the back side, forcing the price up. The amount usually sticks out like a sore thumb, over £100+.

Just jump in front. I aim for a 6:2 ratio between takeprofit and stoploss.
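A rough illustration of that 6:2 takeprofit-to-stoploss ratio in ticks, for a back-then-lay trade (the entry price and tick size are made up, and note Betfair's ladder actually uses different tick sizes in different price bands):

```python
def exit_prices(entry, tick, tp_ticks=6, sl_ticks=2):
    """Back-then-lay sketch: take profit tp_ticks below entry
    (price shortening wins the back trade), stop loss sl_ticks above."""
    take_profit = round(entry - tp_ticks * tick, 2)
    stop_loss = round(entry + sl_ticks * tick, 2)
    return take_profit, stop_loss

print(exit_prices(3.0, 0.05))  # -> (2.7, 3.1)
```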

Have fun,
Emmson
Posts: 3577
Joined: Mon Feb 29, 2016 6:47 pm

Being a dog person, I like to think my small-scale trading is benefiting greyhound welfare in some way. No idea if it is, but I hope so, something skimmed off the turnover or whatever. I was absolutely horrified at an exposé of the Aus greyhound industry I saw on YT recently.
spreadbetting
Posts: 3140
Joined: Sun Jan 31, 2010 8:06 pm

ArticalBadboy wrote:
Tue Dec 29, 2020 2:25 pm
Thanks for the reply SB

I think i understand...I mistakenly thought the 'extract_times' was a function that extracted both Best and Last from gh-racing-runner-greyhound-sub-info.

Thank you
I did a little tinkering today and removed the extract_times routine, so it now simply checks the previous history to pick out the fastest time over the race distance instead of relying on Sporting Life, as they seemed to be a bit hit and miss. I'm sure the code will fall over at some point, as it's only been run a couple of times today and is bound to come across key errors or times when expected data isn't present. But I'll post it up as it may be of use to some; it can be amended to average the last 5 races, races at the course, etc.

Code: Select all

import json
import requests
from bs4 import BeautifulSoup
from requests_html import HTMLSession


def main():
    session = HTMLSession()
    baseUrl = "https://www.sportinglife.com"
    results = []
    res = requests.get("https://www.sportinglife.com/greyhounds/racecards")
    soup = BeautifulSoup(res.text, "html.parser")
    # One link per race time on the day's racecard index
    summary = list(filter(None, [
        link.find_parent('a', class_='')
        for link in soup.find_all('span', class_='gh-meeting-race-time')
    ]))

    for tag in summary:
        link = tag.get('href')
        res = session.get(baseUrl + link)
        soup = BeautifulSoup(res.text, "html.parser")
        race = soup.find('h1').get_text()
        grade = soup.find(class_='gh-racecard-summary-race-class gh-racecard-summary-always-open').get_text()
        distance = soup.find(class_='gh-racecard-summary-race-distance gh-racecard-summary-always-open').get_text().strip("m")

        Runners = dict()
        print(baseUrl + link)

        # The racecard embeds its data as JSON; convert to a dictionary
        data = json.loads(soup.find('script', type='application/json').string)

        for runner in data['props']['pageProps']['race']['runs']:
            Trap = runner['cloth_number']
            Name = runner['greyhound']['name']

            # 100 is the sentinel for "no recorded time at this distance"
            fastest_time = 100
            for time in runner['greyhound']['previous_results']:
                if time['distance'] == distance and time['run_time'] != "":
                    if float(time['run_time'].strip("s")) <= fastest_time:
                        fastest_time = float(time['run_time'].strip("s"))

            # Keyed by time, so two dogs with identical fastest times
            # would overwrite each other
            Runners[fastest_time] = str(Trap) + '. ' + Name

        if Runners and ('OR' in grade or 'A' in grade):
            # Sort (time, name) pairs, fastest first
            x = sorted(Runners.items())

            if (x[1][0] - x[0][0]) >= 0.1:
                timeDiff = round((x[1][0] - x[0][0]), 2)
                results.append(race + ', ' + x[0][1] + ', class ' + grade + ', time difference ' + str(timeDiff))

    results.sort()
    with open('dogs_1.txt', mode='w') as file:
        for line in results:
            file.write(line + '\n')


main()
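For anyone wanting the "average the last 5 races" variant mentioned above, a minimal sketch (the helper name avg_last_n is mine, and it assumes previous_results is ordered newest first, which I haven't verified against the site):

```python
def avg_last_n(previous_results, distance, n=5):
    """Average run_time over the most recent n results at the race
    distance, using the same filtering as the main script. Returns
    100 (the script's 'no time' sentinel) if nothing matches."""
    times = [float(r['run_time'].strip('s'))
             for r in previous_results
             if r['distance'] == distance and r['run_time'] != ""]
    if not times:
        return 100
    recent = times[:n]
    return round(sum(recent) / len(recent), 2)

# Hypothetical data in the same shape the scraper sees:
history = [
    {'distance': '480', 'run_time': '30.25s'},
    {'distance': '480', 'run_time': '30.55s'},
    {'distance': '640', 'run_time': '39.90s'},  # other distance, ignored
    {'distance': '480', 'run_time': '30.40s'},
]
print(avg_last_n(history, '480'))  # -> 30.4
```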
ArticalBadboy
Posts: 106
Joined: Tue Feb 14, 2017 1:43 pm

Thank you for sharing SB...much appreciated!
spreadbetting
Posts: 3140
Joined: Sun Jan 31, 2010 8:06 pm

No problem. Vladimir CC had cleaned up a lot of the code I originally put up, so it was a small tweak to have it access the previous race history.

You can now tweak which parts of the dog's history are used to get your 'fastest time'. I tweaked the line

Code: Select all

if time['distance'] == distance and time['run_time'] != "":
to

Code: Select all

if time['distance'] == distance and time['run_time'] != "" and "2020" in time['date']:
so it only checks race history from 2020, as plenty of races go back as far as 2017 etc. It can be made to match on course or SPs now, as well as set dates, I guess.
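If the substring check ever needs to be a proper date comparison, a sketch along these lines would work, assuming the date field parses as YYYY-MM-DD (the actual Sporting Life format may differ):

```python
from datetime import date

def recent_enough(result, cutoff=date(2020, 1, 1)):
    """Keep only results on/after the cutoff. Assumes the 'date'
    field starts 'YYYY-MM-DD'; adjust the parse if the feed uses
    another format."""
    try:
        return date.fromisoformat(result['date'][:10]) >= cutoff
    except (KeyError, ValueError):
        return False

print(recent_enough({'date': '2020-12-29'}))  # -> True
print(recent_enough({'date': '2017-05-01'}))  # -> False
```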
murdok
Posts: 151
Joined: Sun Apr 02, 2017 7:10 pm

Thanks for the script, here are the results:
19:30 Pelaw Grange Sat 2 January 2021, 4. Parkers Prince, class A5, time difference 0.72
19:32 Romford Sat 2 January 2021, 3. Playtime Girl, class A5, time difference 0.11
19:41 Sheffield Sat 2 January 2021, 5. Swift Colossus, class A4, time difference 0.32
19:44 Monmore Sat 2 January 2021, 6. Dhustone Smile, class A6, time difference 0.23
19:48 Crayford Sat 2 January 2021, 2. Pennys Bobby, class A1, time difference 0.11
19:52 Romford Sat 2 January 2021, 6. Dromulton Bonnie, class A7, time difference 0.13
19:58 Sheffield Sat 2 January 2021, 3. Keefill Orla, class A7, time difference 0.33
20:12 Sheffield Sat 2 January 2021, 4. Roedhelm Tiger, class A2, time difference 0.18
20:17 Hove Sat 2 January 2021, 2. Get On Beckie, class A9, time difference 13.34
20:23 Monmore Sat 2 January 2021, 3. Supreme Ripon, class A2, time difference 0.22
20:27 Sheffield Sat 2 January 2021, 6. Bartlemy Seamus, class A8, time difference 0.34
20:38 Hove Sat 2 January 2021, 1. Warnham Sky, class A4, time difference 0.11
21:02 Monmore Sat 2 January 2021, 2. Tommys Sunshine, class A3, time difference 0.35
21:07 Romford Sat 2 January 2021, 3. Chopchop Lucy, class A4, time difference 0.26
21:12 Hove Sat 2 January 2021, 5. Mercury Athena, class A11, time difference 0.63
21:18 Monmore Sat 2 January 2021, 6. Kurtas Carranna, class A1, time difference 0.24
21:28 Hove Sat 2 January 2021, 2. Lisneal Twister, class A6, time difference 0.14
21:45 Sheffield Sat 2 January 2021, 2. Lightfoot Jones, class A5, time difference 0.26

ArticalBadboy
Posts: 106
Joined: Tue Feb 14, 2017 1:43 pm

I was following the prices on Betdaq.
I'll be honest, I'm not seeing much correlation between the selections and their prices coming in.