I am trying to solve the exercise about fetching URLs in parallel while using a cache to avoid duplicates. I have found a correct solution and can understand it. I see that the correct answer uses channels, with goroutines pushing URLs into the cache through a chan. But why doesn't my simple code work properly? I can't figure out where it goes wrong.

package main

import (
    "fmt"
    "sync"
)

type Fetcher interface {
    // Fetch returns the body of URL and
    // a slice of URLs found on that page.
    Fetch(url string) (body string, urls []string, err error)
}

var cache = struct {
    cache map[string]int
    mux   sync.Mutex
}{cache: make(map[string]int)}

// Crawl uses fetcher to recursively crawl
// pages starting with url, to a maximum of depth.
func Crawl(url string, depth int, fetcher Fetcher) {
    // TODO: Fetch URLs in parallel.
    // TODO: Don't fetch the same URL twice.
    // This implementation doesn't do either:
    if depth <= 0 {
        return
    }
    cache.mux.Lock()
    cache.cache[url] = 1 // put url in cache
    cache.mux.Unlock()
    body, urls, err := fetcher.Fetch(url)
    if err != nil {
        fmt.Println(err)
        return
    }
    fmt.Printf("found: %s %q\n", url, body)
    for _, u := range urls {
        cache.mux.Lock()
        if _, ok := cache.cache[u]; !ok { // check if url already in cache
            cache.mux.Unlock()
            go Crawl(u, depth-1, fetcher)
        } else {
            cache.mux.Unlock()
        }
    }
    return
}

func main() {
    Crawl("http://golang.org/", 4, fetcher)
}

// fakeFetcher is Fetcher that returns canned results.
type fakeFetcher map[string]*fakeResult

type fakeResult struct {
    body string
    urls []string
}

func (f fakeFetcher) Fetch(url string) (string, []string, error) {
    if res, ok := f[url]; ok {
        return res.body, res.urls, nil
    }
    return "", nil, fmt.Errorf("not found: %s", url)
}

// fetcher is a populated fakeFetcher.
var fetcher = fakeFetcher{
    "http://golang.org/": &fakeResult{
        "The Go Programming Language",
        []string{
            "http://golang.org/pkg/",
            "http://golang.org/cmd/",
        },
    },
    // (the remaining fakeFetcher entries were cut off in the original post)
}
1 Answer

慕尼黑5688855
Your main() does not block until all of the go Crawl() calls have finished, so it exits almost immediately. You can use a sync.WaitGroup or a channel to synchronize the program so that all goroutines complete before main returns.
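As a minimal, self-contained sketch of the symptom (my own illustration, not part of the original answer): the two goroutines below may never get to print anything, because main returns first.

package main

import "fmt"

func main() {
    go fmt.Println("found: http://golang.org/pkg/")
    go fmt.Println("found: http://golang.org/cmd/")
    // main returns here and the program exits without waiting,
    // so the two lines above may or may not ever be printed.
}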
I also see a problem with the variable u used inside the goroutine: by the time the goroutine actually runs, the range loop may or may not have already changed the value of u.
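Here is a minimal sketch of that capture problem (my own illustration; the URLs are placeholders). On Go versions before 1.22 the closure shares the single loop variable, so the same URL can be printed several times; passing it as an argument, as the answer's go func(url string){...}(u) does, gives each goroutine its own copy.

package main

import (
    "fmt"
    "sync"
)

func main() {
    urls := []string{"http://golang.org/pkg/", "http://golang.org/cmd/"}
    var wg sync.WaitGroup
    for _, u := range urls {
        wg.Add(1)
        go func() {
            defer wg.Done()
            // Before Go 1.22, u here is the shared loop variable and may
            // already have moved on to a later URL when this line runs.
            fmt.Println("crawling:", u)
        }()
    }
    wg.Wait()
}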
The end of Crawl could look like this to address both problems:
wg := sync.WaitGroup{} // tracks the goroutines spawned below

fmt.Printf("found: %s %q\n", url, body)
for _, u := range urls {
    cache.mux.Lock()
    if _, ok := cache.cache[u]; !ok { // check if url already in cache
        cache.mux.Unlock()
        wg.Add(1)
        go func(url string) {
            Crawl(url, depth-1, fetcher)
            wg.Done()
        }(u) // pass u as an argument so each goroutine gets its own copy
    } else {
        cache.mux.Unlock()
    }
}
// Block until all goroutines are done
wg.Wait()
return
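Since a channel works as well as a sync.WaitGroup here, a rough, self-contained sketch of that alternative (my own, not part of the original answer; the URLs are placeholders) could look like this:

package main

import "fmt"

func main() {
    urls := []string{"http://golang.org/pkg/", "http://golang.org/cmd/"}
    done := make(chan string) // each goroutine reports its URL when finished

    for _, u := range urls {
        go func(url string) {
            // ... fetch/crawl url here ...
            done <- url
        }(u)
    }

    // Receive exactly one message per spawned goroutine before exiting.
    for range urls {
        fmt.Println("finished:", <-done)
    }
}

In the exercise itself the receive loop would live in Crawl (or in main), keeping the program alive until every spawned goroutine has reported back.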