
Using adaptors to import data

Bringing data into Data-flo

Data are brought into Data-flo via Import adaptors, which import data from specific sources and/or file types.

Data sources include Dropbox, Figshare, Google Drive, HTTP requests, Amazon S3, SMB/CIFS shared network drives, website URLs, MySQL, Oracle, PostgreSQL, and SQL Server databases, as well as Microreact and Epicollect projects. You can also manually select and add data files from your computer, such as CSV, TSV, DBF, spreadsheet, and JSON files.
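Conceptually, an import adaptor reads a source and produces a datatable that downstream adaptors can work with. Outside Data-flo, the rough equivalent looks like the sketch below: a minimal illustration in Python with a placeholder file name and a made-up URL, not actual Data-flo behaviour or endpoints.

```python
# Minimal illustration of what an import adaptor does conceptually.
# The file name and URL below are placeholders, not real Data-flo sources.
import pandas as pd

# Roughly what import-from-csv-file does with a local file
local_table = pd.read_csv("samples.csv")

# Roughly what importing a CSV from a website URL does
remote_table = pd.read_csv("https://example.org/data/samples.csv")

# Either way, the result is a datatable that downstream adaptors can
# filter, join, reshape, or export.
print(local_table.head())
```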

Import adaptors

Import adaptors require either an input source (database, URL, etc.) or an input file (CSV, spreadsheet, etc.), which lets the user point the workflow at a specific data source or file when it is run.

Workflow creators and editors can either embed a specific file directly into the workflow or allow users to supply a file or data source at run time.

Files uploaded on the Run page

One or more files can be marked as inputs; the end user then adds these files on the Run page when running the workflow.

Files embedded into the workflow

Alternatively, files can be embedded directly into the import adaptors. In this case, those running the workflow will not see these files.

Embed files into the workflow when their contents rarely change or when a specific file is always required to run the workflow.
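As an analogy outside Data-flo, an embedded file behaves like a value fixed when the workflow is built, while a Run page input behaves like a parameter supplied by whoever runs the workflow. The sketch below illustrates only that distinction in Python, using hypothetical file and column names.

```python
import pandas as pd

# "Embedded" file: fixed when the workflow is built, invisible at run time.
EMBEDDED_LOOKUP = "country_codes.csv"

def run_workflow(sample_sheet_path: str) -> pd.DataFrame:
    """sample_sheet_path plays the role of a Run page input:
    the person running the workflow supplies it each time."""
    samples = pd.read_csv(sample_sheet_path)   # changes on every run
    lookup = pd.read_csv(EMBEDDED_LOOKUP)      # rarely changes, so it is baked in
    return samples.merge(lookup, on="country", how="left")

# result = run_workflow("this_weeks_samples.csv")
```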

Screenshot of the Run page, the front-end view for users of the workflow outputs. The page consists of text boxes, each representing a workflow input; some require a file to be uploaded, others require text.

Screenshot of the canvas, the back-end view for workflow managers, showing how to embed a file into a Data-flo workflow. Clicking the input option field (here called "data") opens a sidebar menu with two options: create a Data-flo input, so that a file can be supplied whenever the workflow is run by other users, or Define a value, which embeds the file in the workflow for future use.