
Commit fa7d50e (1 parent: 3c96e9f)

fix R CMD check url issues & CRAN submission

11 files changed: +22 -33 lines

CRAN-SUBMISSION (+2 -2)

@@ -1,3 +1,3 @@
 Version: 0.7.15
-Date: 2024-08-24 13:03:06 UTC
-SHA: d2ab4788e97bfad0a6e7b7a7c3b70938be954a5e
+Date: 2024-08-25 07:16:38 UTC
+SHA: 3c96e9f6872735f123da4dea2404c8f8df94810f

DESCRIPTION (+1 -2)

@@ -1,5 +1,5 @@
 Package: robotstxt
-Date: 2024-08-24
+Date: 2024-08-25
 Type: Package
 Title: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
 Version: 0.7.15
@@ -24,7 +24,6 @@ Description: Provides functions to download and parse 'robots.txt' files.
     (spiders, crawler, scrapers, ...) are allowed to access specific
     resources on a domain.
 License: MIT + file LICENSE
-LazyData: TRUE
 BugReports: https://github.com/ropensci/robotstxt/issues
 URL: https://docs.ropensci.org/robotstxt/, https://github.com/ropensci/robotstxt
 Imports:

R/get_robotstxt.R (+2 -3)

@@ -8,9 +8,8 @@
 #' @param user_agent HTTP user-agent string to be used to retrieve robots.txt
 #' file from domain
 #'
-#' @param ssl_verifypeer analog to CURL option
-#' \url{https://curl.haxx.se/libcurl/c/CURLOPT_SSL_VERIFYPEER.html} -- and
-#' might help with robots.txt file retrieval in some cases
+#' @param ssl_verifypeer either 1 (default) or 0, if 0 it disables SSL peer verification, which
+#' might help with robots.txt file retrieval
 #' @param rt_robotstxt_http_getter function that executes HTTP request
 #' @param rt_request_handler handler function that handles request according to
 #' the event handlers specified
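The reworded `@param` entry above documents `ssl_verifypeer` as a 1/0 flag. A minimal usage sketch based on that documentation (performs a live HTTP request; the domain is illustrative only):

```r
library(robotstxt)

# Fetch a robots.txt file; ssl_verifypeer = 0 disables SSL peer
# verification, which may help when certificate checks fail.
rtxt <- get_robotstxt(domain = "example.com", ssl_verifypeer = 0)
cat(rtxt)
```

Disabling peer verification trades security for reachability, so 0 should only be used as a diagnostic fallback.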

R/get_robotstxt_http_get.R (+2 -5)

@@ -9,11 +9,8 @@ rt_last_http$request <- list()

 #' get_robotstxt() worker function to execute HTTP request
 #'
-#'
-#' @param ssl_verifypeer analog to CURL option
-#' \url{https://curl.haxx.se/libcurl/c/CURLOPT_SSL_VERIFYPEER.html}
-#' -- and might help with robots.txt file retrieval in some cases
-#'
+#' @param ssl_verifypeer either 1 (default) or 0, if 0 it disables SSL peer verification, which
+#' might help with robots.txt file retrieval
 #' @param domain the domain to get tobots.txt. file for
 #' @param user_agent the user agent to use for HTTP request header
 #'

R/get_robotstxts.R (+2 -4)

@@ -7,10 +7,8 @@
 #' pages and vignettes of package future on how to set up
 #' plans for future execution because the robotstxt package
 #' does not do it on its own.
-#' @param ssl_verifypeer analog to CURL option
-#' \url{https://curl.haxx.se/libcurl/c/CURLOPT_SSL_VERIFYPEER.html}
-#' -- and might help with robots.txt file retrieval in some cases
-#'
+#' @param ssl_verifypeer either 1 (default) or 0, if 0 it disables SSL peer verification, which
+#' might help with robots.txt file retrieval
 #' @param rt_request_handler handler function that handles request according to
 #' the event handlers specified
 #'
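The same `ssl_verifypeer` wording change propagates to the higher-level permission-checking functions (the updated `man/paths_allowed.Rd` below reflects this). A sketch of how the flag is forwarded from `paths_allowed()`, again with an illustrative domain and a live request:

```r
library(robotstxt)

# Check whether specific paths may be crawled; ssl_verifypeer is
# forwarded to the underlying robots.txt retrieval.
paths_allowed(
  paths          = c("/images/", "/admin/"),
  domain         = "example.com",
  ssl_verifypeer = 0
)
```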

cran-comments.md (+3 -2)

@@ -1,5 +1,6 @@
 ## R CMD check results

-0 errors | 0 warnings | 0 note
+0 errors | 0 warnings | 1 note

-* fixing all checks problems
+* fixing "incoming feasibility" URL checks problems
+* changing maintainer to Pedro Baltazar <[email protected]>

man/get_robotstxt.Rd (+2 -3)
man/get_robotstxt_http_get.Rd (+2 -3)
man/get_robotstxts.Rd (+2 -3)
man/paths_allowed.Rd (+2 -3)
man/robotstxt.Rd (+2 -3)

(Generated .Rd files; diffs not rendered by default.)
