library(dplyr)
library(stringr)
library(swfscDAS)
This document introduces the swfscDAS package, and specifically its functionality and workflow. The package is intended to standardize and streamline processing of shipboard DAS data collected using the WinCruz program from the Southwest Fisheries Science Center. In DAS data, an event is only recorded when something changes or happens, which can complicate processing. Thus, the main theme of this package is enabling analyses and downstream processing by 1) determining the associated state and condition information for each event and 2) pulling out event-specific information from the Data (Field) columns.
This package includes a sample DAS file, which we will use in this document:
y <- system.file("das_sample.das", package = "swfscDAS")
head(readLines(y))
#> [1] "1 B.062739 011313 N39:19.22 W137:36.26 1000 c 5 Y "
#> [2] "2 R.062739 011313 N39:19.22 W137:36.26 S "
#> [3] "3 P.062739 011313 N39:19.22 W137:36.26 280 001 126 "
#> [4] "4 V.062739 011313 N39:19.22 W137:36.26 3 03 230 10.0 "
#> [5] "5 N.062739 011313 N39:19.22 W137:36.26 023 09.8 "
#> [6] "6 W.062739 011313 N39:19.22 W137:36.26 1 250 6.0 "
The first step in processing DAS data is to ensure that the DAS file has the expected formatting and values. This package contains the das_check function, which performs some basic checks. This function is a precursor to a more comprehensive DASCHECK program, which is currently in development. The checks performed by this function are detailed in the function documentation, which can be accessed by running ?das_check. You can find the PDF describing the expected DAS data format at https://smwoodman.github.io/swfscDAS/, or see ?das_format_pdf for how to access a local copy. To check for valid species codes with this function, you can pass it an SpCodes.dat file.
# Code not run
y.check <- das_check(y, skip = 0, print.cruise.nums = TRUE)
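If you also want species codes validated, you can point the function at an SpCodes.dat file. The argument name used in the sketch below (sp.codes) is an assumption for illustration only, so confirm it against ?das_check before use.
# Code not run; the sp.codes argument name is an assumption - see ?das_check
y.check.sp <- das_check(y, skip = 0, print.cruise.nums = TRUE,
                        sp.codes = "SpCodes.dat")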
Once QA/QC is complete and you have fixed any data entry errors, you can begin to process the DAS data. The backbone of this package is the reading and processing steps: 1) the data from the DAS file are read into the columns of a data frame and 2) state and condition information are extracted for each event. This means that after processing, you can simply look at any event (row) and determine the Beaufort, viewing conditions, etc., at the time of the event. All other functions in the package depend on the DAS data being in this processed state.
One other processing note: the ‘DateTime’ column in the das_read output has a time zone of “UTC”. This default time zone value is meaningless and is only present because POSIXct vectors must have exactly one time zone. If you need to do conversions based on time zone, use the ‘OffsetGMT’ column together with force_tz and with_tz from the lubridate package. One reason to do such conversions would be if the data were collected in a non-local time zone that causes the daily recorded effort to span more than one day; in that case the ‘OnEffort’ values will be incorrect, because das_process resets the event sequence to off effort at the beginning of a new day.
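For example, below is a minimal sketch of such a conversion using lubridate. It assumes the y.read object created in the next chunk, and the time zone string is only a placeholder; in practice you would pick the zone (or build an Etc/GMT string) based on the ‘OffsetGMT’ column and where the data were collected.
# Code not run
library(lubridate)
# Keep the recorded clock times but re-assign them to a local time zone,
# then express the same instants in UTC
dt.local <- force_tz(y.read$DateTime, tzone = "America/Los_Angeles")
dt.utc <- with_tz(dt.local, tzone = "UTC")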
# Read
y.read <- das_read(y, skip = 0)
glimpse(y.read)
#> Rows: 259
#> Columns: 20
#> $ Event <chr> "B", "R", "P", "V", "N", "W", "V", "W", "W", "*", "P", "V", ~
#> $ EffortDot <lgl> TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, ~
#> $ DateTime <dttm> 2013-01-13 06:27:39, 2013-01-13 06:27:39, 2013-01-13 06:27:~
#> $ Lat <dbl> 39.32033, 39.32033, 39.32033, 39.32033, 39.32033, 39.32033, ~
#> $ Lon <dbl> -137.6043, -137.6043, -137.6043, -137.6043, -137.6043, -137.~
#> $ Data1 <chr> "1000", "S", "280", "3", "023", "1", "3", "1", "1", NA, "208~
#> $ Data2 <chr> "c", NA, "001", "03", "09.8", NA, "03", "02", NA, NA, "280",~
#> $ Data3 <chr> "5", NA, "126", "230", NA, NA, "230", "03", NA, NA, "001", "~
#> $ Data4 <chr> "Y", NA, NA, NA, NA, "250", NA, "257", "257", NA, NA, NA, NA~
#> $ Data5 <chr> NA, NA, NA, "10.0", NA, "6.0", "10.0", "6.0", "6.0", NA, NA,~
#> $ Data6 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, "2.8~
#> $ Data7 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, "1.0~
#> $ Data8 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, "013~
#> $ Data9 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ Data10 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ Data11 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ Data12 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ EventNum <chr> "1", "2", "3", "4", "5", "6", "7", "8", "9", "10", "11", "12~
#> $ file_das <chr> "das_sample.das", "das_sample.das", "das_sample.das", "das_s~
#> $ line_num <int> 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 1~
# Process
y.proc <- das_process(y)
glimpse(y.proc)
#> Rows: 256
#> Columns: 40
#> $ Event <chr> "B", "R", "P", "V", "N", "W", "V", "W", "W", "*", "P", "V", ~
#> $ DateTime <dttm> 2013-01-13 06:27:39, 2013-01-13 06:27:39, 2013-01-13 06:27:~
#> $ Lat <dbl> 39.32033, 39.32033, 39.32033, 39.32033, 39.32033, 39.32033, ~
#> $ Lon <dbl> -137.6043, -137.6043, -137.6043, -137.6043, -137.6043, -137.~
#> $ OnEffort <lgl> TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, ~
#> $ Cruise <dbl> 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, ~
#> $ Mode <chr> "C", "C", "C", "C", "C", "C", "C", "C", "C", "C", "C", "C", ~
#> $ OffsetGMT <int> 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, ~
#> $ EffType <chr> NA, "S", "S", "S", "S", "S", "S", "S", "S", "S", "S", "S", "~
#> $ ESWsides <dbl> NA, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,~
#> $ Course <dbl> NA, NA, NA, NA, 23, 23, 23, 23, 23, 23, 23, 23, 25, 25, 25, ~
#> $ SpdKt <dbl> NA, NA, NA, NA, 9.8, 9.8, 9.8, 9.8, 9.8, 9.8, 9.8, 9.8, 10.2~
#> $ Bft <dbl> NA, NA, NA, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, ~
#> $ SwellHght <dbl> NA, NA, NA, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, ~
#> $ WindSpdKt <dbl> NA, NA, NA, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, ~
#> $ RainFog <dbl> NA, NA, NA, NA, NA, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1~
#> $ HorizSun <dbl> NA, NA, NA, NA, NA, NA, NA, 2, NA, NA, NA, NA, NA, NA, NA, N~
#> $ VertSun <dbl> NA, NA, NA, NA, NA, NA, NA, 3, NA, NA, NA, NA, NA, NA, NA, N~
#> $ Glare <lgl> NA, NA, NA, NA, NA, NA, NA, FALSE, NA, NA, NA, NA, NA, NA, N~
#> $ Vis <dbl> NA, NA, NA, NA, NA, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6~
#> $ ObsL <chr> NA, NA, "280", "280", "280", "280", "280", "280", "280", "28~
#> $ Rec <chr> NA, NA, "001", "001", "001", "001", "001", "001", "001", "00~
#> $ ObsR <chr> NA, NA, "126", "126", "126", "126", "126", "126", "126", "12~
#> $ ObsInd <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ Data1 <chr> "1000", "S", "280", "3", "023", "1", "3", "1", "1", NA, "208~
#> $ Data2 <chr> "c", NA, "001", "03", "09.8", NA, "03", "02", NA, NA, "280",~
#> $ Data3 <chr> "5", NA, "126", "230", NA, NA, "230", "03", NA, NA, "001", "~
#> $ Data4 <chr> "Y", NA, NA, NA, NA, "250", NA, "257", "257", NA, NA, NA, NA~
#> $ Data5 <chr> NA, NA, NA, "10.0", NA, "6.0", "10.0", "6.0", "6.0", NA, NA,~
#> $ Data6 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, "2.8~
#> $ Data7 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, "1.0~
#> $ Data8 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, "013~
#> $ Data9 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ Data10 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ Data11 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ Data12 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ EffortDot <lgl> TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, ~
#> $ EventNum <chr> "1", "2", "3", "4", "5", "6", "7", "8", "9", "10", "11", "12~
#> $ file_das <chr> "das_sample.das", "das_sample.das", "das_sample.das", "das_s~
#> $ line_num <int> 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 1~
# Note that das_read can read multiple files at once
y2.read <- das_read(c(y, y))
Once you have processed the DAS data, you can easily access a variety of information. For instance, you can look at the different events or Beaufort values that occurred in the data, or filter for specific events to get the beginning and ending points of each effort section.
# The number of each event
table(y.proc$Event)
#>
#> * 1 2 3 4 ? A B C E F N P R S V W s t
#> 90 8 7 6 2 1 8 2 5 10 1 19 18 10 8 22 26 7 6
# The number of events per Beaufort value
table(y.proc$Bft)
#>
#> 2 3
#> 94 120
# Filter for R and E events to extract lat/lon points
y.proc %>%
  filter(Event %in% c("R", "E")) %>%
  select(Event, Lat, Lon, Cruise, Mode, EffType) %>%
  head()
#> Event Lat Lon Cruise Mode EffType
#> 1 R 39.32033 -137.6043 1000 C S
#> 2 E 39.36717 -137.5817 1000 C S
#> 3 R 39.37617 -137.5978 1000 C S
#> 4 E 39.51933 -137.5277 1000 C S
#> 5 R 39.56800 -137.4530 1000 C S
#> 6 E 39.75433 -137.4107 1000 C S
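Building on the filter above, here is a minimal sketch (not a package function) that pairs each R event with the following E event to get one row per continuous effort section. It assumes every R event in the data is followed by a matching E event.
y.proc %>%
  filter(Event %in% c("R", "E")) %>%
  mutate(section_id = cumsum(Event == "R")) %>%
  group_by(section_id) %>%
  summarise(
    Lat_start = Lat[Event == "R"], Lon_start = Lon[Event == "R"],
    Lat_end = Lat[Event == "E"], Lon_end = Lon[Event == "E"],
    .groups = "drop"
  )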
The swfscDAS package also contains functions for extracting and/or summarizing particular information from the processed data. First is das_sight, a function that returns a data frame with pertinent sighting data pulled out into their own columns. Because different data are collected for each type of sighting, and because end-users have different needs, this function offers several return formats: in the “default” format there is one row for each species of each sighting, in the “wide” format there is one row for each sighting event, and in the “complete” format there is one row for every group size estimate of each sighting. See ?das_sight for more details. The “complete” format is intended for users looking to do observer corrections, or to combine estimates using a method other than the arithmetic mean (e.g. the geometric mean).
<- das_sight(y.proc, return.format = "default")
y.sight %>%
y.sight select(Event, SightNo:PerpDistKm) %>%
glimpse()
#> Rows: 25
#> Columns: 32
#> $ Event <chr> "S", "S", "s", "s", "s", "s", "s", "s", "t", "t", "t", "S~
#> $ SightNo <chr> "1406", "1407", "1407", "1407", "1407", "1407", "1407", "~
#> $ Subgroup <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SightNoDaily <chr> "20130113_1", "20130113_2", NA, NA, NA, NA, NA, NA, NA, N~
#> $ Obs <chr> "208", "125", NA, NA, NA, NA, NA, NA, "280", "149", "228"~
#> $ ObsStd <lgl> TRUE, TRUE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FAL~
#> $ Bearing <dbl> 309, 326, 11, 5, 50, 71, 104, 2, 120, 270, 300, 270, 45, ~
#> $ Reticle <dbl> 2.8, 0.4, 2.0, 3.5, NA, 4.5, 4.5, 2.2, NA, NA, NA, 14.0, ~
#> $ DistNm <dbl> 1.06, 2.97, 1.30, 0.90, 0.50, 0.70, 0.70, 1.30, 0.03, 0.0~
#> $ Cue <dbl> 3, 3, NA, NA, NA, NA, NA, NA, NA, NA, NA, 3, NA, 3, NA, 3~
#> $ Method <dbl> 4, 4, NA, NA, NA, NA, NA, NA, NA, NA, NA, 4, NA, 4, NA, 4~
#> $ Photos <chr> "N", "Y", NA, NA, NA, NA, NA, NA, NA, NA, NA, "N", NA, "Y~
#> $ Birds <chr> "N", "N", NA, NA, NA, NA, NA, NA, NA, NA, NA, "N", NA, "Y~
#> $ CalibSchool <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ PhotosAerial <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ Biopsy <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ Prob <lgl> FALSE, FALSE, NA, NA, NA, NA, NA, NA, NA, NA, NA, FALSE, ~
#> $ nSp <int> 1, 1, NA, NA, NA, NA, NA, NA, NA, NA, NA, 1, NA, 1, NA, 2~
#> $ Mixed <lgl> FALSE, FALSE, NA, NA, NA, NA, NA, NA, NA, NA, NA, FALSE, ~
#> $ SpCode <chr> "018", "076", NA, NA, NA, NA, NA, NA, "LV", "DC", "DC", "~
#> $ SpCodeProb <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSchoolBest <dbl> NA, 8.00000, NA, NA, NA, NA, NA, NA, 1.00000, 2.00000, 1.~
#> $ GsSchoolHigh <dbl> NA, 14.00, NA, NA, NA, NA, NA, NA, NA, NA, NA, 20.00, NA,~
#> $ GsSchoolLow <dbl> 42.333333, 5.666667, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ GsSpBest <dbl> NA, 8.00000, NA, NA, NA, NA, NA, NA, 1.00000, 2.00000, 1.~
#> $ GsSpHigh <dbl> NA, 14.000, NA, NA, NA, NA, NA, NA, NA, NA, NA, 20.000, N~
#> $ GsSpLow <dbl> 42.333333, 5.666667, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ CourseSchool <dbl> NA, NA, NA, NA, NA, 100, 100, NA, NA, NA, NA, NA, NA, NA,~
#> $ TurtleJFR <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ TurtleAge <chr> NA, NA, NA, NA, NA, NA, NA, NA, "A", "A", "J", NA, "A", N~
#> $ TurtleCapt <chr> NA, NA, NA, NA, NA, NA, NA, NA, "N", "N", "N", NA, NA, NA~
#> $ PerpDistKm <dbl> 1.525631e+00, 3.075807e+00, 4.593917e-01, 1.452712e-01, 7~
<- das_sight(y.proc, return.format = "wide")
y.sight.wide %>%
y.sight.wide select(Event, SightNo:PerpDistKm) %>%
glimpse()
#> Rows: 22
#> Columns: 58
#> $ Event <chr> "S", "S", "s", "s", "s", "s", "s", "s", "t", "t", "t", "S~
#> $ SightNo <chr> "1406", "1407", "1407", "1407", "1407", "1407", "1407", "~
#> $ Subgroup <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SightNoDaily <chr> "20130113_1", "20130113_2", NA, NA, NA, NA, NA, NA, NA, N~
#> $ Obs <chr> "208", "125", NA, NA, NA, NA, NA, NA, "280", "149", "228"~
#> $ ObsStd <lgl> TRUE, TRUE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FAL~
#> $ Bearing <dbl> 309, 326, 11, 5, 50, 71, 104, 2, 120, 270, 300, 270, 45, ~
#> $ Reticle <dbl> 2.8, 0.4, 2.0, 3.5, NA, 4.5, 4.5, 2.2, NA, NA, NA, 14.0, ~
#> $ DistNm <dbl> 1.06, 2.97, 1.30, 0.90, 0.50, 0.70, 0.70, 1.30, 0.03, 0.0~
#> $ Cue <dbl> 3, 3, NA, NA, NA, NA, NA, NA, NA, NA, NA, 3, NA, 3, NA, 3~
#> $ Method <dbl> 4, 4, NA, NA, NA, NA, NA, NA, NA, NA, NA, 4, NA, 4, NA, 4~
#> $ Photos <chr> "N", "Y", NA, NA, NA, NA, NA, NA, NA, NA, NA, "N", NA, "Y~
#> $ Birds <chr> "N", "N", NA, NA, NA, NA, NA, NA, NA, NA, NA, "N", NA, "Y~
#> $ CalibSchool <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ PhotosAerial <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ Biopsy <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ Prob <lgl> FALSE, FALSE, NA, NA, NA, NA, NA, NA, NA, NA, NA, FALSE, ~
#> $ nSp <int> 1, 1, NA, NA, NA, NA, NA, NA, NA, NA, NA, 1, NA, 1, NA, 2~
#> $ Mixed <lgl> FALSE, FALSE, NA, NA, NA, NA, NA, NA, NA, NA, NA, FALSE, ~
#> $ ObsEstimate <list> <"280", "001", "208">, <"280", "001", "125">, <NULL>, <N~
#> $ SpCode1 <chr> "018", "076", NA, NA, NA, NA, NA, NA, NA, NA, NA, "037", ~
#> $ SpCode2 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCode3 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCode4 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCodeProb1 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCodeProb2 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCodeProb3 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCodeProb4 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpPerc1 <dbl> 100, 100, NA, NA, NA, NA, NA, NA, NA, NA, NA, 100, NA, 10~
#> $ SpPerc2 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpPerc3 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpPerc4 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSchoolBest <dbl> NA, 8.00000, NA, NA, NA, NA, NA, NA, NA, NA, NA, 10.66667~
#> $ GsSchoolHigh <dbl> NA, 14.00, NA, NA, NA, NA, NA, NA, NA, NA, NA, 20.00, NA,~
#> $ GsSchoolLow <dbl> 42.333333, 5.666667, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ GsSpBest1 <dbl> NA, 8.00000, NA, NA, NA, NA, NA, NA, NA, NA, NA, 10.66667~
#> $ GsSpBest2 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSpBest3 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSpBest4 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSpHigh1 <dbl> NA, 14.000, NA, NA, NA, NA, NA, NA, NA, NA, NA, 20.000, N~
#> $ GsSpHigh2 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSpHigh3 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSpHigh4 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSpLow1 <dbl> 42.333333, 5.666667, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ GsSpLow2 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSpLow3 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSpLow4 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ CourseSchool <dbl> NA, NA, NA, NA, NA, 100, 100, NA, NA, NA, NA, NA, NA, NA,~
#> $ TurtleSp <chr> NA, NA, NA, NA, NA, NA, NA, NA, "LV", "DC", "DC", NA, "DC~
#> $ TurtleGs <dbl> NA, NA, NA, NA, NA, NA, NA, NA, 1, 2, 1, NA, 1, NA, NA, N~
#> $ TurtleJFR <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ TurtleAge <chr> NA, NA, NA, NA, NA, NA, NA, NA, "A", "A", "J", NA, "A", N~
#> $ TurtleCapt <chr> NA, NA, NA, NA, NA, NA, NA, NA, "N", "N", "N", NA, NA, NA~
#> $ PinnipedSp <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ PinnipedGs <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ BoatType <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ BoatGs <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ PerpDistKm <dbl> 1.525631e+00, 3.075807e+00, 4.593917e-01, 1.452712e-01, 7~
<- das_sight(y.proc, return.format = "complete")
y.sight.complete %>%
y.sight.complete select(Event, SightNo:PerpDistKm) %>%
glimpse()
#> Rows: 37
#> Columns: 46
#> $ Event <chr> "S", "S", "S", "S", "S", "S", "s", "s", "s", "s", "s", "s~
#> $ SightNo <chr> "1406", "1406", "1406", "1407", "1407", "1407", "1407", "~
#> $ Subgroup <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SightNoDaily <chr> "20130113_1", "20130113_1", "20130113_1", "20130113_2", "~
#> $ Obs <chr> "208", "208", "208", "125", "125", "125", NA, NA, NA, NA,~
#> $ ObsStd <lgl> TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, FALSE, FALSE, FALSE, ~
#> $ Bearing <dbl> 309, 309, 309, 326, 326, 326, 11, 5, 50, 71, 104, 2, 120,~
#> $ Reticle <dbl> 2.8, 2.8, 2.8, 0.4, 0.4, 0.4, 2.0, 3.5, NA, 4.5, 4.5, 2.2~
#> $ DistNm <dbl> 1.06, 1.06, 1.06, 2.97, 2.97, 2.97, 1.30, 0.90, 0.50, 0.7~
#> $ Cue <dbl> 3, 3, 3, 3, 3, 3, NA, NA, NA, NA, NA, NA, NA, NA, NA, 3, ~
#> $ Method <dbl> 4, 4, 4, 4, 4, 4, NA, NA, NA, NA, NA, NA, NA, NA, NA, 4, ~
#> $ Photos <chr> "N", "N", "N", "Y", "Y", "Y", NA, NA, NA, NA, NA, NA, NA,~
#> $ Birds <chr> "N", "N", "N", "N", "N", "N", NA, NA, NA, NA, NA, NA, NA,~
#> $ CalibSchool <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ PhotosAerial <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ Biopsy <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ Prob <lgl> FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, NA, NA, NA, NA,~
#> $ nSp <int> 1, 1, 1, 1, 1, 1, NA, NA, NA, NA, NA, NA, NA, NA, NA, 1, ~
#> $ Mixed <lgl> FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, NA, NA, NA, NA,~
#> $ ObsEstimate <chr> "280", "001", "208", "280", "001", "125", NA, NA, NA, NA,~
#> $ SpCode1 <chr> "018", "018", "018", "076", "076", "076", NA, NA, NA, NA,~
#> $ SpCode2 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCode3 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCode4 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCodeProb1 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCodeProb2 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCodeProb3 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpCodeProb4 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpPerc1 <dbl> 100, 100, 100, 100, 100, 100, NA, NA, NA, NA, NA, NA, NA,~
#> $ SpPerc2 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpPerc3 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ SpPerc4 <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSchoolBest <dbl> NA, NA, NA, 6, 9, 9, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ GsSchoolHigh <dbl> NA, NA, NA, 10, 10, 22, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ GsSchoolLow <dbl> 43, 36, 48, 6, 2, 9, NA, NA, NA, NA, NA, NA, NA, NA, NA, ~
#> $ CourseSchool <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, 100, 100, NA, NA, NA,~
#> $ TurtleSp <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, "LV", "DC~
#> $ TurtleGs <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, 1, 2, 1, ~
#> $ TurtleJFR <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ TurtleAge <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, "A", "A",~
#> $ TurtleCapt <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, "N", "N",~
#> $ PinnipedSp <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ PinnipedGs <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ BoatType <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ BoatGs <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N~
#> $ PerpDistKm <dbl> 1.52563078, 1.52563078, 1.52563078, 3.07580701, 3.0758070~
You can also easily filter or subset the sighting data for the desired event code(s):
y.sight.sg <- das_sight(y.proc, return.events = c("S", "G"))

# Note that this is equivalent to:
y.sight.sg2 <- das_sight(y.proc) %>% filter(Event %in% c("S", "G"))
In addition, you can chop the effort data into segments, and summarize the conditions and sightings on those segments, using das_effort and das_effort_sight. These effort segments can be used for line transect estimates with the Distance software, for species distribution modeling, or for summarizing the number of sightings of certain species on each segment, among other uses. das_effort chops continuous effort sections (the event sequence from an R event to an E event) into effort segments using one of several chopping methods: condition (a new effort segment every time a condition changes), equal length (effort segments of equal length), or section (each segment is a full continuous effort section, i.e. it runs from an R event to an E event). das_effort_sight takes the output of das_effort and returns the number of included sightings and animals per segment for specified species codes.
Both functions return a list of three data frames: segdata, sightinfo, and randpicks. These data frames and the different chopping methodologies are described in depth in the function documentation (?das_effort and ?das_effort_sight). Briefly, segdata contains information about each effort segment, sightinfo contains information about the sightings, such as their corresponding segment, and randpicks contains information specific to the ‘equal length’ chopping method. das_effort and das_effort_sight are separate functions to give the user more control over which sightings are included in the effort segment summaries (see ?das_effort). See below for how to chop/split effort lines by strata.
# Chop the effort into 10km segments
y.eff.eq <- das_effort(
  y.proc, method = "equallength", seg.km = 10, dist.method = "greatcircle",
  num.cores = 1
)
#> No argument was passed via randpicks.load, and thus new randpicks values will be generated

# Chop the effort every time a condition changes
y.eff <- das_effort(
  y.proc, method = "condition", seg.min.km = 0,
  dist.method = "greatcircle", conditions = c("Bft", "SwellHght", "Vis"),
  num.cores = 1
)
y.eff.sight <- das_effort_sight(y.eff, sp.codes = c("018", "076"))
glimpse(y.eff.sight$segdata)
#> Rows: 20
#> Columns: 31
#> $ segnum <int> 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16~
#> $ section_id <int> 1, 2, 2, 3, 3, 3, 3, 3, 3, 3, 4, 4, 5, 6, 6, 7, 7, 8,~
#> $ section_sub_id <dbl> 1, 1, 2, 1, 2, 3, 4, 5, 6, 7, 1, 2, 1, 1, 2, 1, 2, 1,~
#> $ file <chr> "das_sample.das", "das_sample.das", "das_sample.das",~
#> $ stlin <int> 2, 23, 33, 59, 69, 70, 75, 78, 84, 85, 99, 108, 127, ~
#> $ endlin <int> 20, 33, 43, 69, 70, 75, 78, 84, 85, 90, 108, 121, 147~
#> $ lat1 <dbl> 39.32033, 39.37617, 39.42950, 39.56800, 39.66082, 39.~
#> $ lon1 <dbl> -137.6043, -137.5978, -137.5715, -137.4530, -137.4132~
#> $ DateTime1 <dttm> 2013-01-13 06:27:39, 2013-01-13 06:58:04, 2013-01-13~
#> $ lat2 <dbl> 39.36716, 39.42950, 39.51933, 39.66082, 39.66133, 39.~
#> $ lon2 <dbl> -137.5817, -137.5715, -137.5277, -137.4132, -137.4130~
#> $ DateTime2 <dttm> 2013-01-13 06:46:25, 2013-01-13 07:20:02, 2013-01-13~
#> $ mlat <dbl> 39.34377, 39.40288, 39.47435, 39.61433, 39.66108, 39.~
#> $ mlon <dbl> -137.5930, -137.5848, -137.5493, -137.4327, -137.4131~
#> $ mDateTime <dttm> 2013-01-13 06:37:02, 2013-01-13 07:09:03, 2013-01-13~
#> $ dist <dbl> 5.5577, 6.3431, 10.6674, 10.8651, 0.0574, 1.4189, 1.9~
#> $ year <dbl> 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013,~
#> $ month <dbl> 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,~
#> $ day <int> 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 1~
#> $ mtime <chr> "06:37:02", "07:09:03", "07:38:33", "09:40:55", "09:5~
#> $ Cruise <dbl> 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000,~
#> $ Mode <chr> "C", "C", "C", "C", "C", "C", "C", "C", "C", "C", "C"~
#> $ EffType <chr> "S", "S", "S", "S", "S", "S", "S", "S", "S", "S", "S"~
#> $ ESWsides <dbl> 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,~
#> $ maxdistBft <dbl> 3, 3, 3, 3, 3, 2, 2, 2, 2, 2, 3, 3, 3, 3, 2, 3, 2, 2,~
#> $ maxdistSwellHght <dbl> 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3,~
#> $ maxdistVis <dbl> 6.0, 6.0, 5.5, 5.5, 6.0, 6.0, 5.5, 4.5, 3.5, 2.5, 5.8~
#> $ nSI_018 <dbl> 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,~
#> $ ANI_018 <dbl> 42.33333, 0.00000, 0.00000, 0.00000, 0.00000, 0.00000~
#> $ nSI_076 <dbl> 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,~
#> $ ANI_076 <dbl> 0, 0, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,~
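As a simple downstream use of segdata, here is a minimal sketch computing an encounter rate (sightings per km) for species 018 on each segment; the SPKM_018 column name is only illustrative.
y.eff.sight$segdata %>%
  mutate(SPKM_018 = nSI_018 / dist) %>%
  select(segnum, dist, nSI_018, ANI_018, SPKM_018) %>%
  head()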
glimpse(y.eff.sight$sightinfo)
#> Rows: 13
#> Columns: 64
#> $ segnum <int> 1, 3, 4, 13, 13, 15, 17, 18, 18, 20, 20, 20, 20
#> $ mlat <dbl> 39.34377, 39.47435, 39.61433, 40.20895, 40.20895, 40.3483~
#> $ mlon <dbl> -137.5930, -137.5493, -137.4327, -137.1531, -137.1531, -1~
#> $ Event <chr> "S", "S", "t", "t", "S", "t", "S", "S", "S", "S", "S", "S~
#> $ DateTime <dttm> 2013-01-13 06:46:02, 2013-01-13 07:56:22, 2013-01-13 09:3~
#> $ year <dbl> 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 20~
#> $ Lat <dbl> 39.36617, 39.51767, 39.59733, 40.18283, 40.26567, 40.3605~
#> $ Lon <dbl> -137.5820, -137.5285, -137.4400, -137.1622, -137.1350, -1~
#> $ OnEffort <lgl> TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRU~
#> $ Cruise <dbl> 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, 100~
#> $ Mode <chr> "C", "C", "C", "C", "C", "C", "C", "C", "C", "C", "C", "C~
#> $ OffsetGMT <int> 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5
#> $ EffType <chr> "S", "S", "S", "S", "S", "S", "S", "S", "S", "S", "S", "S~
#> $ ESWsides <dbl> 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2
#> $ Course <dbl> 25, 26, 27, 20, 20, 16, 25, 30, 30, 23, 23, 23, 23
#> $ SpdKt <dbl> 10.2, 9.7, 9.0, 9.3, 9.3, 8.9, 8.9, 9.5, 9.5, 9.6, 9.6, 9~
#> $ Bft <dbl> 3, 3, 3, 3, 3, 2, 2, 2, 2, 2, 2, 2, 2
#> $ SwellHght <dbl> 3, 3, 3, 3, 3, 3, 3, 3, 3, 1, 1, 1, 1
#> $ WindSpdKt <dbl> 10, 10, 10, 6, 6, 6, 6, 6, 6, 5, 5, 5, 5
#> $ RainFog <dbl> 1, 3, 1, 1, 1, 1, 1, 1, 1, 3, 3, 3, 3
#> $ HorizSun <dbl> NA, 2, 2, 8, 8, 9, 8, 8, 8, NA, NA, NA, NA
#> $ VertSun <dbl> NA, 2, 2, 1, 1, 1, 2, 2, 2, NA, NA, NA, NA
#> $ Glare <lgl> NA, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALS~
#> $ Vis <dbl> 6.0, 5.5, 5.5, 6.0, 6.0, 6.0, 6.0, 6.0, 6.0, 4.0, 4.0, 4.~
#> $ ObsL <chr> "208", "125", "001", "280", "280", "125", "149", "126", "~
#> $ Rec <chr> "280", "208", "126", "001", "001", "208", "125", "149", "~
#> $ ObsR <chr> "001", "280", "149", "126", "126", "280", "208", "125", "~
#> $ ObsInd <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA
#> $ EffortDot <lgl> TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRU~
#> $ EventNum <chr> "15", "35", "59", "131", "136", "153", "167", "181", "181~
#> $ file_das <chr> "das_sample.das", "das_sample.das", "das_sample.das", "da~
#> $ line_num <int> 15, 38, 65, 137, 142, 162, 176, 193, 193, 248, 248, 252, ~
#> $ SightNo <chr> "1406", "1407", NA, NA, "1408", NA, "1409", "1410", "1410~
#> $ Subgroup <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA
#> $ SightNoDaily <chr> "20130113_1", "20130113_2", NA, NA, "20130113_3", NA, "20~
#> $ Obs <chr> "208", "125", "280", "228", "280", "231", "149", "125", "~
#> $ ObsStd <lgl> TRUE, TRUE, FALSE, FALSE, TRUE, FALSE, TRUE, TRUE, TRUE, ~
#> $ Bearing <dbl> 309, 326, 120, 300, 270, 45, 344, 70, 70, 359, 359, 38, 38
#> $ Reticle <dbl> 2.8, 0.4, NA, NA, 14.0, NA, 0.2, 1.4, 1.4, 0.3, 0.3, 0.8,~
#> $ DistNm <dbl> 1.06, 2.97, 0.03, 0.02, 0.28, 0.05, 3.68, 1.66, 1.66, 3.2~
#> $ Cue <dbl> 3, 3, NA, NA, 3, NA, 3, 3, 3, 2, 2, 3, 3
#> $ Method <dbl> 4, 4, NA, NA, 4, NA, 4, 4, 4, 4, 4, 4, 4
#> $ Photos <chr> "N", "Y", NA, NA, "N", NA, "Y", "Y", "Y", "Y", "Y", "Y", ~
#> $ Birds <chr> "N", "N", NA, NA, "N", NA, "Y", "N", "N", "N", "N", "N", ~
#> $ CalibSchool <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA
#> $ PhotosAerial <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA
#> $ Biopsy <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA
#> $ Prob <lgl> FALSE, FALSE, NA, NA, FALSE, NA, FALSE, FALSE, FALSE, FAL~
#> $ nSp <int> 1, 1, NA, NA, 1, NA, 1, 2, 2, 2, 2, 2, 2
#> $ Mixed <lgl> FALSE, FALSE, NA, NA, FALSE, NA, FALSE, TRUE, TRUE, TRUE,~
#> $ SpCode <chr> "018", "076", "LV", "DC", "037", "DC", "016", "013", "016~
#> $ SpCodeProb <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, "016", "016"
#> $ GsSchoolBest <dbl> NA, 8.00000, 1.00000, 1.00000, 10.66667, 1.00000, 46.6666~
#> $ GsSchoolHigh <dbl> NA, 14.00, NA, NA, 20.00, NA, 79.00, 72.75, 72.75, 249.00~
#> $ GsSchoolLow <dbl> 42.333333, 5.666667, NA, NA, 10.666667, NA, 46.666667, 41~
#> $ GsSpBest <dbl> NA, 8.00000, 1.00000, 1.00000, 10.66667, 1.00000, 46.6666~
#> $ GsSpHigh <dbl> NA, 14.000, NA, NA, 20.000, NA, 79.000, 53.165, 19.585, 2~
#> $ GsSpLow <dbl> 42.333333, 5.666667, NA, NA, 10.666667, NA, 46.666667, 30~
#> $ CourseSchool <dbl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA
#> $ TurtleJFR <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA
#> $ TurtleAge <chr> NA, NA, "A", "J", NA, "A", NA, NA, NA, NA, NA, NA, NA
#> $ TurtleCapt <chr> NA, NA, "N", "N", NA, NA, NA, NA, NA, NA, NA, NA, NA
#> $ PerpDistKm <dbl> 1.52563078, 3.07580701, 0.04811637, 0.03207758, 0.5185600~
#> $ included <lgl> TRUE, TRUE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FAL~
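Because sightinfo keeps one row per sighting along with its segment number and the ‘included’ flag, you can also summarize sightings per segment yourself, for instance for species codes beyond those passed to sp.codes. A minimal sketch:
y.eff.sight$sightinfo %>%
  filter(included) %>%
  count(segnum, SpCode, name = "n_sight")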
This package contains several functions for incorporating strata into your DAS data processing: das_intersects_strata for assigning points to strata and das_effort for chopping effort lines by strata. These functions have a ‘strata.files’ argument, which expects a (named) list of paths to CSV files; names are automatically generated if not provided. These files must have headers, longitude values in column one, latitude values in column two, and describe closed polygons.
First, das_intersects_strata allows you to add columns to data frames indicating whether a point intersects one or more strata polygons. You can pass this function either a data frame or a list. If a list, it must be the output of das_effort or das_effort_sight, and the function will use the segment midpoints to determine whether a segment (and its associated sightings) intersected each stratum.
<- system.file("das_sample_stratum.csv", package = "swfscDAS")
stratum.file <- das_intersects_strata(y.eff.sight, list(InPoly = stratum.file))
y.eff.sight.strata
glimpse(y.eff.sight.strata$segdata)
#> Rows: 20
#> Columns: 32
#> $ segnum <int> 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16~
#> $ section_id <int> 1, 2, 2, 3, 3, 3, 3, 3, 3, 3, 4, 4, 5, 6, 6, 7, 7, 8,~
#> $ section_sub_id <dbl> 1, 1, 2, 1, 2, 3, 4, 5, 6, 7, 1, 2, 1, 1, 2, 1, 2, 1,~
#> $ file <chr> "das_sample.das", "das_sample.das", "das_sample.das",~
#> $ stlin <int> 2, 23, 33, 59, 69, 70, 75, 78, 84, 85, 99, 108, 127, ~
#> $ endlin <int> 20, 33, 43, 69, 70, 75, 78, 84, 85, 90, 108, 121, 147~
#> $ lat1 <dbl> 39.32033, 39.37617, 39.42950, 39.56800, 39.66082, 39.~
#> $ lon1 <dbl> -137.6043, -137.5978, -137.5715, -137.4530, -137.4132~
#> $ DateTime1 <dttm> 2013-01-13 06:27:39, 2013-01-13 06:58:04, 2013-01-13~
#> $ lat2 <dbl> 39.36716, 39.42950, 39.51933, 39.66082, 39.66133, 39.~
#> $ lon2 <dbl> -137.5817, -137.5715, -137.5277, -137.4132, -137.4130~
#> $ DateTime2 <dttm> 2013-01-13 06:46:25, 2013-01-13 07:20:02, 2013-01-13~
#> $ mlat <dbl> 39.34377, 39.40288, 39.47435, 39.61433, 39.66108, 39.~
#> $ mlon <dbl> -137.5930, -137.5848, -137.5493, -137.4327, -137.4131~
#> $ mDateTime <dttm> 2013-01-13 06:37:02, 2013-01-13 07:09:03, 2013-01-13~
#> $ dist <dbl> 5.5577, 6.3431, 10.6674, 10.8651, 0.0574, 1.4189, 1.9~
#> $ year <dbl> 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013,~
#> $ month <dbl> 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,~
#> $ day <int> 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 13, 1~
#> $ mtime <chr> "06:37:02", "07:09:03", "07:38:33", "09:40:55", "09:5~
#> $ Cruise <dbl> 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000,~
#> $ Mode <chr> "C", "C", "C", "C", "C", "C", "C", "C", "C", "C", "C"~
#> $ EffType <chr> "S", "S", "S", "S", "S", "S", "S", "S", "S", "S", "S"~
#> $ ESWsides <dbl> 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,~
#> $ maxdistBft <dbl> 3, 3, 3, 3, 3, 2, 2, 2, 2, 2, 3, 3, 3, 3, 2, 3, 2, 2,~
#> $ maxdistSwellHght <dbl> 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3,~
#> $ maxdistVis <dbl> 6.0, 6.0, 5.5, 5.5, 6.0, 6.0, 5.5, 4.5, 3.5, 2.5, 5.8~
#> $ nSI_018 <dbl> 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,~
#> $ ANI_018 <dbl> 42.33333, 0.00000, 0.00000, 0.00000, 0.00000, 0.00000~
#> $ nSI_076 <dbl> 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,~
#> $ ANI_076 <dbl> 0, 0, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,~
#> $ InPoly <dbl> 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1,~
In addition, you can chop/split effort lines by strata using the ‘strata.files’ argument of das_effort. Effort lines are chopped/split by strata before being processed using the specified method (condition, equal length, etc.). The ‘segdata’ element of the das_effort output will then contain a ‘stratum’ column, which gives the name of the stratum that each segment is in. See ?das_effort and ?das_effort_strata for more details.
y.eff.strata.section <- das_effort(
  y.proc, method = "section", strata.files = list(stratum.file),
  num.cores = 1
)
#> although coordinates are longitude/latitude, st_intersection assumes that they are planar
y.eff.strata.condition <- das_effort(
  y.proc, method = "condition", seg.min.km = 0,
  strata.files = list(Poly1 = stratum.file),
  num.cores = 1
)
#> although coordinates are longitude/latitude, st_intersection assumes that they are planar
glimpse(y.eff.strata.section$segdata)
#> Rows: 10
#> Columns: 33
#> $ segnum <int> 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
#> $ section_id <int> 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
#> $ section_sub_id <dbl> 1, 1, 1, 1, 1, 1, 1, 1, 1, 1
#> $ file <chr> "das_sample.das", "das_sample.das", "das_sample.das", "~
#> $ stlin <int> 2, 23, 59, 99, 127, 150, 167, 188, 232, 242
#> $ endlin <int> 20, 43, 90, 121, 147, 164, 181, 199, 240, 259
#> $ lat1 <dbl> 39.32033, 39.37617, 39.56800, 39.94517, 40.15217, 40.26~
#> $ lon1 <dbl> -137.6043, -137.5978, -137.4530, -137.3692, -137.1737, ~
#> $ DateTime1 <dttm> 2013-01-13 06:27:39, 2013-01-13 06:58:04, 2013-01-13 09~
#> $ lat2 <dbl> 39.36716, 39.51933, 39.75433, 40.12745, 40.26617, 40.37~
#> $ lon2 <dbl> -137.5817, -137.5277, -137.4107, -137.2488, -137.1348, ~
#> $ DateTime2 <dttm> 2013-01-13 06:46:25, 2013-01-13 07:57:05, 2013-01-13 10~
#> $ mlat <dbl> 39.34377, 39.44767, 39.66117, 40.03679, 40.20895, 40.32~
#> $ mlon <dbl> -137.5930, -137.5625, -137.4131, -137.3101, -137.1531,~
#> $ mDateTime <dttm> 2013-01-13 06:37:02, 2013-01-13 07:27:34, 2013-01-13 09~
#> $ dist <dbl> 5.5577, 17.0106, 21.8086, 22.7090, 13.0922, 12.3011, 8.~
#> $ year <dbl> 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, 2013, ~
#> $ month <dbl> 1, 1, 1, 1, 1, 1, 1, 1, 1, 1
#> $ day <int> 13, 13, 13, 13, 13, 13, 13, 13, 14, 14
#> $ mtime <chr> "06:37:02", "07:27:34", "09:59:20", "12:34:14", "14:14:~
#> $ Cruise <dbl> 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1000, 1~
#> $ Mode <chr> "C", "C", "C", "C", "C", "C", "C", "C", "C", "C"
#> $ EffType <chr> "S", "S", "S", "S", "S", "S", "S", "S", "S", "S"
#> $ ESWsides <dbl> 2, 2, 2, 2, 2, 2, 2, 2, 2, 2
#> $ avgBft <dbl> 3.000000, 3.000000, 2.500835, 3.000000, 3.000000, 2.47~
#> $ avgSwellHght <dbl> 3, 3, 3, 3, 3, 3, 3, 3, 1, 1
#> $ avgHorizSun <dbl> 2.000000, 2.000000, 2.000000, 12.000000, 8.000000, 8.52~
#> $ avgVertSun <dbl> 3, 2, 2, 12, 1, 1, 2, 2, NA, NA
#> $ avgGlare <dbl> 0, 0, 0, 0, 0, 0, 0, 0, NA, NA
#> $ avgVis <dbl> 6.000000, 5.686447, 5.090748, 5.974103, 6.000000, 6.000~
#> $ avgCourse <dbl> 23.56198, 27.11868, 95.65614, 34.61155, 20.43919, 16.47~
#> $ avgSpdKt <dbl> 9.912395, 9.476263, 9.287725, 9.428340, 9.343919, 9.091~
#> $ stratum <chr> NA, NA, NA, NA, "Stratum1", "Stratum1", "Stratum1", "St~
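With the stratum column in segdata, per-stratum summaries are straightforward. For example, a minimal sketch of total effort distance by stratum (in the output above, segments outside the stratum have an NA stratum value):
y.eff.strata.section$segdata %>%
  group_by(stratum) %>%
  summarise(dist_km = sum(dist), .groups = "drop")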
Comments
In addition, you can use das_comments to generate comment strings. This is particularly useful when looking for comments containing certain keywords.
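Below is a minimal sketch of that workflow; the comment_str column name is an assumption about the das_comments output, so check ?das_comments for the actual structure.
# Code not run
y.comm <- das_comments(y.proc)
# Keep only events whose comment string contains a keyword of interest
y.comm %>% filter(str_detect(comment_str, "resight")) %>% head()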