Data reduction for paleomagnetic data aboard the JOIDES Resolution#
This notebook is for people who want to download and manipulate data from an IODP expedition using the LIMS Online Repository (LORE). The basic procedure is outlined below, and the notebook guides you through it step by step.
Import the required Python packages and set up the desired directory structure for each HOLE.
Download the Section Summary table and put the .csv file in the HOLE working directory.
Download the Sample Report table for any discrete samples taken (select Sample Type CUBE and Test Code PMAG) and put the .csv file in the HOLE working directory.
Download the SRM archive-half data for the hole and put them in the SRM_archive_data directory in the HOLE working directory.
Download the SRM discrete measurement data for the hole and put them in the SRM_discrete_data directory.
Download the JR6 data and put them in the JR6_data directory.
Download the KLY4S data and put them in the KLY4S_data directory.
If you want to edit the archive half data for coring or lithologic disturbances:
Download the core disturbance info from DescLogic:
go to the DescLogic computer station and open DescLogic (the little yellow square)
select the ‘macroscopic’ template
select sample: hole, Archive, section half
download
export: include classification and data, and save to the Desktop
if the network drives are not available, right-click on the double-square icon on the task bar
switch the login profile to scientist and log in with OES credentials (user name/email and password)
choose scientist and map the drives; they should now be available in Windows Explorer
Copy the exported file to your HOLE working directory.
If you want to use X-ray information to edit your archive half data: fill in the XRAY disturbance summary file and put it in the HOLE working directory [a template is in data_files/iodp_magic].
To start processing data from a single HOLE:
Duplicate this notebook with the HOLE name (e.g., U999A) by first making a copy, then renaming it under the ‘File’ menu. Follow the instructions below.
For HELP:
- for help on the options of any Python function, type help(MODULE.FUNCTION) in a code cell. For example, help(convert.iodp_samples_csv) shows how to use convert.iodp_samples_csv; see the example cell after this list. This works for any Python function.
- email ltauxe@ucsd.edu. Include a description of your problem, a screen shot of the error if appropriate, and an example data file that is causing the difficulty (be mindful of embargo issues).
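As an illustration of the help tip above, running a cell like the one below prints the documentation for the sample conversion function used later in this notebook (any imported PmagPy module or function works the same way):
help(convert.iodp_samples_csv)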
For a worked example (using FAKE data), see data_files/iodp_magic/U999A.ipynb.
Table of Contents#
Preliminaries : import required packages, set up directory structure and set file names
Make sample tables : parse downloaded sample table to standard format
Archive measurements : parse downloaded srm archive measurements and perform editing functions
Discrete sample measurements : parse downloaded discrete sample data
Downhole plots : make downhole plots of remanence
Anisotropy of magnetic susceptibility : plots of AMS
Prepare files for uploading to MagIC : you can upload data to the MagIC database as a private contribution. Then when you have published your data, you can activate your contribution with the DOI of your publication.
Preliminaries#
In the cell below, edit the HOLE name and set the hole latitude and longitude (HOLE stands for the hole name, e.g., U999A).
Edit the file names in the cell below as you download them from LORE.
Every time you open this notebook, you must click on the cell below and then click ‘Run’ in the menu above to execute it.
# import a bunch of packages for use in the notebook
import pmagpy.pmag as pmag # a bunch of PmagPy modules
import pmagpy.pmagplotlib as pmagplotlib
import pmagpy.ipmag as ipmag
import pmagpy.contribution_builder as cb
from pmagpy import convert_2_magic as convert # conversion scripts for many lab formats
from pmagpy import iodp_funcs as iodp_funcs # functions for dealing with LIMS data
import matplotlib.pyplot as plt # our plotting buddy
import numpy as np # the fabulous NumPy package
import pandas as pd # and of course Pandas
%matplotlib inline
from importlib import reload
import warnings
warnings.filterwarnings("ignore")
meas_files,spec_files,samp_files,site_files=[],[],[],[] # holders for magic file names
import os
# Modify these for your expedition
exp_name,exp_description="","" # e.g., 'IODP Expedition 382','Iceberg Alley'
# Edit these for each hole. In the following, HOLE refers to the current hole.
hole="" # e.g., U999A
# edit these for the current hole - get it from Hole summary on LORE
hole_lat,hole_lon=0+0/60,0+0/60 # e.g., -57+26.5335/60,-43+21.4723/60
gad_inc=pmag.pinc(hole_lat) # geocentric axial dipole inclination for hole
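# (pmag.pinc uses the geocentric axial dipole relation tan(I) = 2*tan(latitude),
#  so, for example, a hole at 45 N has an expected GAD inclination of about 63.4 degrees)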
demag_step=0.015 # choose the demagnetization step (in tesla) for downhole plots
# set up the directory structure for this hole
# these are default locations and variables - do not change them.
jr6_dir=hole+'/JR6_data'
kly4s_dir=hole+'/KLY4S_data'
srm_archive_dir=hole+'/SRM_archive_data'
srm_discrete_dir=hole+'/SRM_discrete_data'
magic_dir=hole+'/'+hole+'_MagIC'
dscr_file="" # if no discrete samples, this stays blank
# set up the directory structure
if hole: # checks if the hole name has been set.
    if hole not in os.listdir(): # checks if the directory structure exists, otherwise this will create it
        os.mkdir(hole)
        os.mkdir(jr6_dir)
        os.mkdir(kly4s_dir)
        os.mkdir(srm_archive_dir)
        os.mkdir(srm_discrete_dir)
        os.mkdir(magic_dir)
        os.mkdir('Figures')
# After mkdir has been run, you can:
# 1) download all the files you need
# 2) put them in the correct folders as instructed below
# 3) edit the file names below to match the downloaded files
# Put the downloaded files into the HOLE directory (after mkdir has been executed)
# required for downhole direction plot. Put the file in the HOLE directory
section_summary_file="" # set this to the downloaded summary file (e.g., "Section Summary_15_5_2019.csv")
if section_summary_file: # add the path to the summary file, if set.
    summary_file=hole+'/'+section_summary_file
# required for downhole AMS plot. Put the file in the HOLE directory
core_summary_file="" # set this to the downloaded Core summary file (e.g., "Core Summary_15_4_2019.csv")
# required for unpacking LIMS discrete sample data. Put the file in the HOLE directory
samp_file="" # set this to your sample file (e.g., samp_file='samples_5_4_2019.csv' )
# Put these files into the designated subdirectories in the HOLE directory:
# required for unpacking SRM archive measurements file. It should be in HOLE/SRM_archive_data/
srm_archive_file="" # set this to your srm archive file from LIMS (e.g., "srmsection_1_4_2019.csv")
# required for unpacking SRM discrete measurements file. It should be in HOLE/SRM_discrete_data/
srm_discrete_file= '' # SRM discrete measurements file (e.g., "srmdiscrete_7_4_2019.csv")
# required for unpacking SRM discrete measurements with OFFLINE TREATMENTS. Put it in: HOLE/SRM_discrete_data/
# NB: you also need the srm_discrete_file set
dscr_ex_file='' # SRM discrete extended file (e.g., "ex-srm_26_4_2019.csv")
# needed for unpacking JR-6 measurement data. Put the file in HOLE/JR6_data/ directory
jr6_file='' # JR6 data file from LORE (e.g., "spinner_1_4_2019.csv")
# required for unpacking Kappabridge data. Be sure to download the expanded file from LORE.
# Put it in HOLE/KLY4S_data/
kly4s_file='' # set this to the downloaded file (e.g., "ex-kappa_15_4_2019.csv")
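Optionally, you can run a quick sanity check (a minimal sketch, not part of the standard workflow) that reports which of the file names set above actually exist in the directory structure before you move on:
# optional: report which of the downloaded files can be found (blank file names are skipped)
for label,path in [('section summary',hole+'/'+section_summary_file if section_summary_file else ''),
                   ('sample report',hole+'/'+samp_file if samp_file else ''),
                   ('SRM archive',srm_archive_dir+'/'+srm_archive_file if srm_archive_file else ''),
                   ('SRM discrete',srm_discrete_dir+'/'+srm_discrete_file if srm_discrete_file else ''),
                   ('JR6',jr6_dir+'/'+jr6_file if jr6_file else ''),
                   ('KLY4S',kly4s_dir+'/'+kly4s_file if kly4s_file else '')]:
    if path:
        print(label,':',path,'->','found' if os.path.exists(path) else 'MISSING')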
Make the sample tables#
Download the Sample Report and Section Summary tables (when available) for the HOLE from the LIMS online repository at http://web.ship.iodp.tamu.edu/LORE/ and save them as csv files.
Put the two .csv files in the HOLE directory created above.
Edit the name of the sample .csv file in the preliminaries cell, and set the ‘secondary depth’ column that you selected when downloading (the default, CSF-B, is used below) in the cell below.
Execute the cell to populate the MagIC metadata tables. The depth information ends up in, for example, the lims_sites.txt table in the HOLE_MagIC directory and gets used to create the downhole plots.
Run the following cell to create the samples, sites, and locations tables for the hole.
# Make sure the sample file is in your HOLE directory and the filename set in the Preliminaries
# Note: this program will not run if the file is in use
if samp_file:
    comp_depth_key='Top depth CSF-B (m)'
    # do the heavy lifting:
    convert.iodp_samples_csv(samp_file,input_dir_path=hole,spec_file='lims_specimens.txt',\
                             samp_file='lims_samples.txt',site_file='lims_sites.txt',\
                             dir_path=magic_dir,comp_depth_key=comp_depth_key,\
                             exp_name=exp_name,exp_desc=exp_description,lat=hole_lat,\
                             lon=hole_lon)
    # this collects the file names that were created so they can be combined with others, e.g., those
    # from the archive half measurements which are not in the sample table.
    if 'lims_specimens.txt' not in spec_files:spec_files.append('lims_specimens.txt')
    if 'lims_samples.txt' not in samp_files:samp_files.append('lims_samples.txt')
    if 'lims_sites.txt' not in site_files:site_files.append('lims_sites.txt')
    # do it again to make copies for use with demag_gui
    convert.iodp_samples_csv(samp_file,input_dir_path=hole,\
                             dir_path=magic_dir,comp_depth_key=comp_depth_key,\
                             exp_name=exp_name,exp_desc=exp_description,lat=hole_lat,\
                             lon=hole_lon)
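To spot-check that the depth information made it into the sites table, you can peek at lims_sites.txt. A minimal sketch (the MagIC tables written by PmagPy are tab delimited with one header line to skip):
if samp_file:
    sites_df=pd.read_csv(magic_dir+'/lims_sites.txt',sep='\t',header=1)
    print(sites_df[['site','core_depth']].head())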
Convert the SRM archive half data for the Hole#
Download data for a single hole from LORE as a .csv file and edit the file name in the preliminaries cell.
Put the file into the HOLE/SRM_archive_data directory.
Edit the composite depth column header (comp_depth_key) in the cell below if desired.
If you have been busy measuring archives, writing the measurement files can take a while, so be patient.
# Fill in the name of your SRM archive half measurement file in the preliminaries cell
#   (e.g., srm_archive_file="srmsection_1_4_2019.csv")
# Make sure the archive measurement file is in your HOLE/SRM_archive_data directory.
# Note: this program will not run if the file is in use
if srm_archive_file:
    comp_depth_key='Depth CSF-B (m)'
    convert.iodp_srm_lore(srm_archive_file,meas_file='srm_arch_measurements.txt', comp_depth_key=comp_depth_key,\
                          dir_path=magic_dir,input_dir_path=srm_archive_dir,lat=hole_lat,lon=hole_lon)
    if 'srm_arch_measurements.txt' not in meas_files:meas_files.append('srm_arch_measurements.txt')
    if 'srm_arch_specimens.txt' not in spec_files:spec_files.append('srm_arch_specimens.txt')
    if 'srm_arch_samples.txt' not in samp_files:samp_files.append('srm_arch_samples.txt')
    if 'srm_arch_sites.txt' not in site_files:site_files.append('srm_arch_sites.txt')
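Before editing, it can help to check which AF treatment steps are actually present in the converted archive-half measurements so that demag_step (set in tesla in the preliminaries cell) matches one of them. A minimal sketch, assuming the converted file records AF treatments in the MagIC treat_ac_field column (as the discrete measurement files below do):
if srm_archive_file:
    arch_df=pd.read_csv(magic_dir+'/srm_arch_measurements.txt',sep='\t',header=1)
    print(arch_df.treat_ac_field.unique()) # available AF steps, in tesla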
Editing of SRM archive data:
Filter for the desired demag step (demag_step, set in the preliminaries cell).
Remove data from within 80 cm of core tops and within 10 cm of section ends.
If desired (set remove_desclogic_disturbance=True), remove data from intervals labeled “high” disturbance in DescLogic.
Go to DescLogic and export the list of disturbances for the hole.
Put this in HOLE_disturbances.xlsx in the HOLE directory (HOLE is your hole name, set in the preliminaries cell).
If desired (set remove_xray_disturbance=True), remove data from disturbed intervals based on the X-rays. You have to create the X-ray data file yourself; a template that must be followed is in PmagPy/data_files/iodp_magic.
Adjust the declinations so that the average normal polarity declination is 90°.
# to execute this cell, change False to True in the "if" statement below; turn it back to False when done so you don't rerun it by accident.
remove_ends=True
remove_desclogic_disturbance=False
remove_xray_disturbance=False
core_top=80 # remove the top 80 cm of each core - change as desired
section_ends=10 # remove 10 cm from either end of each section - change as desired
if False:
    arch_demag_step=iodp_funcs.demag_step(magic_dir,hole,demag_step) # pick the demag step
    if remove_ends:
        noends=iodp_funcs.remove_ends(arch_demag_step,hole,\
                                      core_top=core_top,section_ends=section_ends) # remove the ends
    else:
        noends=arch_demag_step
    if remove_desclogic_disturbance:
        nodist=iodp_funcs.remove_disturbance(noends,hole) # remove coring disturbances
    else:
        nodist=noends
    if remove_xray_disturbance:
        no_xray_df=iodp_funcs.no_xray_disturbance(nodist,hole)
    else:
        no_xray_df=nodist
    adj_dec_df,core_dec_adj=iodp_funcs.adj_dec(no_xray_df,hole)
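The downhole-plot cells below read HOLE_arch_demag_step.csv and HOLE_dec_adjusted.csv from the HOLE directory, which the editing functions above are expected to have written. A quick, optional check that they are there:
if os.path.exists(hole+'/'+hole+'_arch_demag_step.csv'):
    print(pd.read_csv(hole+'/'+hole+'_arch_demag_step.csv').head())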
Convert SRM discrete sample data to MagIC:#
Download data for a single hole from LORE as a .csv file.
For OFFLINE treatments (ARM, IRM, DTECH AF, thermal), download both the “standard” and the extended files.
Put the files into the HOLE/SRM_discrete_data directory.
For “regular” SRM files (no offline treatments), edit the file name and execute the cell below.
File names are set in the preliminaries cell.
if srm_discrete_file:
    convert.iodp_dscr_lore(srm_discrete_file,meas_file='srm_dscr_measurements.txt', \
                           dir_path=magic_dir,input_dir_path=srm_discrete_dir,spec_file='lims_specimens.txt')
    if 'srm_dscr_measurements.txt' not in meas_files:meas_files.append('srm_dscr_measurements.txt')
    dscr_file='srm_dscr_measurements.txt'
For OFFLINE treatments, specify the extended discrete SRM file name.
NB: for this to work properly, you must follow these conventions when running the SRM in offline mode.
Put these in your comment field in the IMS-10 program for discrete samples:
for NRMs: NRM
for AF demag at 10 mT (for example): AF:10
for thermal at 200°C: T:200
for ARM with 100 mT AC and 50 µT DC: ARM:100:.05
for IRM at 1000 mT: IRM:1000
A sketch of how these comment strings map onto treatment values follows the conversion cell below.
Set the file names in the preliminaries cell.
if dscr_ex_file and srm_discrete_file:
    convert.iodp_dscr_lore(srm_discrete_file,dscr_ex_file=dscr_ex_file,meas_file='srm_dscr_measurements.txt', \
                           dir_path=magic_dir,input_dir_path=srm_discrete_dir,spec_file='lims_specimens.txt',\
                           offline_meas_file='srm_dscr_offline_measurements.txt')
    if 'srm_dscr_measurements.txt' not in meas_files:meas_files.append('srm_dscr_measurements.txt')
    if 'srm_dscr_offline_measurements.txt' not in meas_files:meas_files.append('srm_dscr_offline_measurements.txt')
    ipmag.combine_magic(['srm_dscr_measurements.txt','srm_dscr_offline_measurements.txt'],
                        outfile='dscr_measurements.txt',dir_path=magic_dir)
    dscr_file='dscr_measurements.txt'
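To illustrate the offline comment convention listed above, here is a minimal, purely hypothetical sketch of how such strings map onto treatment values (the actual conversion is handled by convert.iodp_dscr_lore; parse_offline_comment is made up for illustration only):
def parse_offline_comment(comment):
    # return (treatment type, parameters) for an IMS-10 offline comment string
    parts=comment.split(':')
    code=parts[0]
    if code=='NRM': return 'NRM',{}
    if code=='AF': return 'AF demag',{'peak_field_mT':float(parts[1])}
    if code=='T': return 'thermal',{'temperature_C':float(parts[1])}
    if code=='ARM': return 'ARM',{'ac_field_mT':float(parts[1]),'dc_field_mT':float(parts[2])}
    if code=='IRM': return 'IRM',{'pulse_field_mT':float(parts[1])}
    return 'unknown',{}
# e.g., parse_offline_comment('ARM:100:.05') returns ('ARM', {'ac_field_mT': 100.0, 'dc_field_mT': 0.05})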
To make some quick Zijderveld plots in the notebook, run the cell below (it executes if dscr_file has been set). To save all the plots, set save_plots to True.
if dscr_file:
    ipmag.zeq_magic(meas_file=dscr_file,\
                    spec_file='lims_specimens.txt',input_dir_path=magic_dir,n_plots="all",save_plots=False)
If you did a bunch of full demagnetizations, set max_field to your maximum AF field (in tesla). To save your plots, set save_plots to True. To change the output format, change ‘svg’ to whatever you want (‘pdf’, ‘eps’, ‘png’).
cnt=1
max_field=0 # set this to your peak field (in tesla) to execute this
if max_field:
    srm_dscr_df=pd.read_csv(magic_dir+'/'+dscr_file,sep='\t',header=1)
    srm_dmag=srm_dscr_df[srm_dscr_df.treat_ac_field>=max_field] # find all the demag specimens
    spc_list=srm_dmag.specimen.unique()
    for spc in spc_list:
        ipmag.zeq_magic(meas_file=dscr_file,specimen=spc,fignum=cnt,\
                        spec_file='lims_specimens.txt',input_dir_path=magic_dir,save_plots=False,
                        fmt='svg')
        cnt+=3
Import the JR6 data.#
Download the JR6 data from LORE and put it in the JR6_data folder in the HOLE directory.
Edit the file name in the preliminaries cell to reflect the actual file name.
Execute the two cells below in order.
if jr6_file:
    convert.iodp_jr6_lore(jr6_file,meas_file='jr6_measurements.txt',dir_path=magic_dir,\
                          input_dir_path=jr6_dir,spec_file='lims_specimens.txt',noave=False)
    if 'jr6_measurements.txt' not in meas_files:meas_files.append('jr6_measurements.txt')
    dscr_file='jr6_measurements.txt'
# if there are both SRM and JR6 discrete data, combine them here:
if jr6_file and srm_discrete_file:
    ipmag.combine_magic(['srm_dscr_measurements.txt','jr6_measurements.txt'],
                        outfile='dscr_measurements.txt',dir_path=magic_dir)
    dscr_file='dscr_measurements.txt'
max_field=0 # set this to your peak field (in tesla) to execute this
if max_field:
    cnt=1
    dscr_df=pd.read_csv(magic_dir+'/dscr_measurements.txt',sep='\t',header=1)
    dmag_df=dscr_df[dscr_df.treat_ac_field>=max_field] # find all the demag specimens
    spc_list=dmag_df.specimen.unique()
    for spc in spc_list:
        ipmag.zeq_magic(meas_file='dscr_measurements.txt',specimen=spc,fignum=cnt,\
                        spec_file='lims_specimens.txt',input_dir_path=magic_dir,save_plots=False)
        cnt+=3
AMS data#
Convert AMS data to MagIC
Download the KAPPABRIDGE expanded magnetic susceptibility data from the LIMS Online Repository.
Place the downloaded .csv file in the KLY4S_data directory in the HOLE directory.
Change kly4s_file to the correct file name in the preliminaries cell.
if kly4s_file:
    # actual_volume is the specimen volume in cc (the standard 7 cc discrete cubes)
    convert.iodp_kly4s_lore(kly4s_file, meas_out='kly4s_measurements.txt',
                            spec_infile='lims_specimens.txt', spec_out='kly4s_specimens.txt',
                            dir_path=magic_dir, input_dir_path=kly4s_dir,actual_volume=7)
    if 'kly4s_measurements.txt' not in meas_files:meas_files.append('kly4s_measurements.txt')
    if 'kly4s_specimens.txt' not in spec_files:spec_files.append('kly4s_specimens.txt')
To make a depth plot of your AMS data, change the False to True in the cell below and run it.
if False:
    ipmag.ani_depthplot(spec_file='kly4s_specimens.txt', dir_path=magic_dir,
                        samp_file='lims_samples.txt',site_file='lims_sites.txt',
                        dmin=-1,dmax=-1,meas_file='kly4s_measurements.txt',
                        sum_file=core_summary_file)
    plt.savefig('Figures/'+hole+'_anisotropy_xmastree.pdf')
This makes equal-area plots in core coordinates. To run it, set the False to True. To save the plots, set save_plots to True. For other options, try help(ipmag.aniso_magic_nb).
if False:
    ipmag.aniso_magic_nb(infile=magic_dir+'/kly4s_specimens.txt',\
                         verbose=False,save_plots=False,ihext=False,iboot=True,ivec=True)
Downhole Plots#
Fill in the section summary file name in the preliminaries cell and run the cells below.
if dscr_file: # checks for discrete measurements, adds depths, etc.
    srm_dscr_df=pd.read_csv(magic_dir+'/'+dscr_file,sep='\t',header=1)
    dscr_df=srm_dscr_df.copy(deep=True)
    dscr_df=dscr_df[srm_dscr_df['treat_ac_field']==demag_step]
    depth_data=pd.read_csv(magic_dir+'/lims_sites.txt',sep='\t',header=1)
    depth_data['specimen']=depth_data['site']
    depth_data=depth_data[['specimen','core_depth']]
    depth_data=depth_data.sort_values(by='specimen')
    dscr_df=pd.merge(dscr_df,depth_data,on='specimen')
else:
    dscr_df="" # if no discrete samples.
if section_summary_file:
    arch_demag_step=pd.read_csv(hole+'/'+hole+'_arch_demag_step.csv')
    adj_dec_df=pd.read_csv(hole+'/'+hole+'_dec_adjusted.csv')
    # get the section boundaries from the Section Summary table downloaded from LIMS
    # (the file name was set in the preliminaries cell)
    summary_df=pd.read_csv(summary_file)
    summary_df.dropna(subset=['Sect'],inplace=True)
    if summary_df.Sect.dtype!='object': # section numbers were read in as numbers, not strings
        summary_df.Sect=summary_df.Sect.astype('int64')
        summary_df.Sect=summary_df.Sect.astype('str')
    summary_df=summary_df[summary_df['Sect'].str.contains('CC')==False]
    max_depth=arch_demag_step['core_depth'].max()
    summary_df=summary_df[summary_df['Top depth CSF-A (m)']<max_depth]
    sect_depths=summary_df['Top depth CSF-A (m)'].values
    summary_df['Core']=summary_df['Core'].astype('int')
    labels=summary_df['Core'].astype('str')+summary_df['Type']+'-'+summary_df['Sect'].astype('str')
    interval=100 # depth interval (in meters) covered by each plot
    depth_min,depth_max=0,interval
    fignum=1
    while depth_min<arch_demag_step.core_depth.max():
        iodp_funcs.make_plot(arch_demag_step,adj_dec_df,sect_depths,hole,\
                             gad_inc,depth_min,depth_max,labels,spec_df=dscr_df,fignum=fignum)
        depth_min+=interval
        depth_max+=interval
        fignum+=1
Tidy up for MagIC#
Combine all the MagIC files for uploading to MagIC (http://earthref.org/MagIC).
if False: # when you are ready to upload, set this to True to make the upload file.
    ipmag.combine_magic(spec_files,outfile='specimens.txt',dir_path=magic_dir)
    ipmag.combine_magic(samp_files,outfile='samples.txt',dir_path=magic_dir)
    ipmag.combine_magic(site_files,outfile='sites.txt',dir_path=magic_dir)
    ipmag.combine_magic(meas_files,outfile='measurements.txt',dir_path=magic_dir)
    ipmag.upload_magic(dir_path=magic_dir)