Dear Open Hub Users,
We’re excited to announce that we will be moving the Open Hub Forum to
https://community.blackduck.com/s/black-duck-open-hub.
Beginning immediately, users can head over, register, get technical help, and discuss issues pertinent to the Open Hub. Registered users can also subscribe to Open Hub announcements there.
On May 1, 2020, we will be freezing https://www.openhub.net/forums, and users will no longer be able to create new discussions. If you have any questions or concerns, please email us at
info@openhub.net.
The title pretty much says it: I would like to download all files returned by a code search query. Is there a built-in function for that, or do I need to write some kind of crawler?
Also, are there any legal issues with doing this? I do not want to replicate the functionality provided by code search or anything. My intention is to analyse certain file types for research purposes and I need these files locally available.
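In case it helps, this is roughly the kind of crawler I had in mind. The search URL and the query parameter names in the sketch are only guesses on my part, not anything documented:

```python
import time

import requests
from bs4 import BeautifulSoup

# Rough sketch only: the search URL and the "s"/"p" parameter names are my
# guesses, not a documented Open Hub interface.
SEARCH_URL = "https://code.openhub.net/search"

def fetch_result_links(query, page):
    """Fetch one page of search results and return the file links found on it."""
    resp = requests.get(SEARCH_URL, params={"s": query, "p": page}, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # Guessing that links to individual file views contain "/file" in the path.
    return [a["href"] for a in soup.find_all("a", href=True) if "/file" in a["href"]]

def download_all(query, max_pages=5, delay=1.0):
    """Walk the first few result pages and visit each linked file."""
    for page in range(1, max_pages + 1):
        for link in fetch_result_links(query, page):
            print("would download:", link)  # fetch and save to disk here
            time.sleep(delay)  # be polite to the server

if __name__ == "__main__":
    download_all("fopen filetype:c")
```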
Joerg,
At the moment, Ohloh Code is still (I think...) in beta, and I'm pretty sure we don't yet have any API functions available for that service. I'll pass along your suggestion, however, and perhaps something will emerge when that issue is examined.
Legally, you should be on solid ground as long as the projects you're researching have open-source licenses. I think this can be determined via our existing API.
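For example, something along these lines should work against the existing API. I'm writing this from memory, so treat the endpoint shape and the XML field names as assumptions and double-check them against the API documentation:

```python
import requests
import xml.etree.ElementTree as ET

API_KEY = "YOUR_API_KEY"  # register for a key on openhub.net

def project_licenses(project_name):
    """Return the license names reported for a project by the Open Hub XML API.

    The endpoint and field names are from memory; verify them in the API docs.
    """
    url = f"https://www.openhub.net/p/{project_name}.xml"
    resp = requests.get(url, params={"api_key": API_KEY}, timeout=30)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    return [name.text for name in root.findall(".//licenses/license/name")]

if __name__ == "__main__":
    print(project_licenses("firefox"))
```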
Thanks!
Hi,
Thanks! I wasn't entirely sure about Black Duck's policies, but thanks for the clarification.
Regards
I have a more or less similar question.
I want to see all of the results on one page (right now I only get 10 per page).
I already tried fiddling with the GET parameters, but pp does not stand for results per page ;-)
Also, can I combine search terms with an AND and/or OR operator?
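For reference, the workaround I've been trying is to fetch the result pages one by one and stitch them together. The URL and parameter names in this sketch are pure guesses on my part, since I couldn't find them documented anywhere:

```python
import requests

# Workaround sketch: fetch successive result pages and collect them, since I
# could not find a working results-per-page parameter. The URL and the
# "s"/"p" parameter names are guesses.
SEARCH_URL = "https://code.openhub.net/search"

def collect_all_pages(query, max_pages=10):
    """Return the raw HTML of the first max_pages result pages for a query."""
    pages = []
    for page in range(1, max_pages + 1):
        resp = requests.get(SEARCH_URL, params={"s": query, "p": page}, timeout=30)
        resp.raise_for_status()
        pages.append(resp.text)
    return pages

if __name__ == "__main__":
    results = collect_all_pages("fopen filetype:c")
    print(f"fetched {len(results)} pages")
```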
Sincerely yours
Can you post the example code here? Also, see how we accomplished the same thing here.