API Reference#
pmagpy.ipmag#
- class pmagpy.ipmag.Site(site_name, data_path, data_format='MagIC')[source]#
This Site class is for use within Jupyter/IPython notebooks. It reads in MagIC-formatted data (text files) and compiles fits, separates them by type, and plots equal-area projections inline. If means were not taken and output within the Demag GUI, it should automatically compute the Fisher mean for each fit type. Code is still a work in progress, but it is currently useful for succinctly computing/displaying data in notebook format.
- pmagpy.ipmag.aarm_magic(meas_file, dir_path='.', input_dir_path='', input_spec_file='specimens.txt', output_spec_file='specimens.txt')[source]#
Converts AARM data to best-fit tensor (6 elements plus sigma)
- Parameters:
meas_file (str) – input measurement file
dir_path (str) – output directory, default “.”
input_dir_path (str) – input file directory IF different from dir_path, default “”
input_spec_file (str) – input specimen file name, default “specimens.txt”
output_spec_file (str) – output specimen file name, default “specimens.txt”
- Returns:
True or False indicating if conversion was successful and output file name written
- Info:
Input for is a series of baseline, ARM pairs. The baseline should be the AF demagnetized state (3 axis demag is preferable) for the following ARM acquisition. The order of the measurements is:
- for 6 positions (AF demag before each step):
labfield parallel to X
labfield parallel to Y
labfield parallel to Z
labfield anti-parallel to X
labfield anti-parallel to Y
labfield anti-parallel to Z
- for 9 positions (AF demag before each step):
positions 1,2,3,6,7,8,11,12,13 (from Figure D.2 in Essentials, earthref.org/MagIC/books/Tauxe/Essentials, Appendix D)
- for 15 positions (AF demag before each step):
positions 1-15 (for 15 positions)
- pmagpy.ipmag.aarm_magic_dm2(infile, dir_path='.', input_dir_path='', spec_file='specimens.txt', samp_file='samples.txt', data_model_num=3, coord='s')[source]#
Converts AARM data to best-fit tensor (6 elements plus sigma)
- Parameters:
infile (str) – input measurement file
dir_path (str) – output directory, default “.”
input_dir_path (str) – input file directory IF different from dir_path, default “”
spec_file (str) – input/output specimen file name, default “specimens.txt”
samp_file (str) – input sample file name, default “samples.txt”
data_model_num (int) – MagIC data model [2, 3], default 3
coord (str) – coordinate system specimen/geographic/tilt-corrected, [‘s’, ‘g’, ‘t’], default ‘s’
- Returns:
- Tuple
True or False indicating if conversion was successful, output file name written
- Info:
Input for is a series of baseline, ARM pairs. The baseline should be the AF demagnetized state (3 axis demag is preferable) for the following ARM acquisition. The order of the measurements is:
positions 1,2,3, 6,7,8, 11,12,13 (for 9 positions)
positions 1,2,3,4, 6,7,8,9, 11,12,13,14 (for 12 positions)
positions 1-15 (for 15 positions)
- pmagpy.ipmag.ani_depthplot(spec_file='specimens.txt', samp_file='samples.txt', meas_file='measurements.txt', site_file='sites.txt', age_file='', sum_file='', fmt='svg', dmin=-1, dmax=-1, depth_scale='core_depth', dir_path='.', contribution=None)[source]#
returns matplotlib figure with anisotropy data plotted against depth available depth scales: ‘composite_depth’, ‘core_depth’ or ‘age’ (you must provide an age file to use this option). You must provide valid specimens and sites files, and either a samples or an ages file. You may additionally provide measurements and a summary file (csv).
- Parameters:
spec_file (str) – default “specimens.txt”
samp_file (str) – default “samples.txt”
meas_file (str) – default “measurements.txt”
site_file (str) – default “sites.txt”
age_file (str) – default “”
sum_file (str) – default “”
fmt (str) – str, default “svg” format for figures, [“svg”, “jpg”, “pdf”, “png”]
dmin (number) – default -1 minimum depth to plot (if -1, default to plotting all)
dmax (number) – default -1 maximum depth to plot (if -1, default to plotting all)
depth_scale (str) – default “core_depth” scale to plot, [‘composite_depth’, ‘core_depth’, ‘age’]. if ‘age’ is selected, you must provide an ages file.
dir_path (str) – default “.” directory for input files
contribution – cb.Contribution, default None if provided, use Contribution object instead of reading in data from files
- Returns:
- plot
matplotlib plot, or False if no plot could be created
- name
figure name, or error message if no plot could be created
- pmagpy.ipmag.ani_depthplot2(ani_file='rmag_anisotropy.txt', meas_file='magic_measurements.txt', samp_file='er_samples.txt', age_file=None, sum_file=None, fmt='svg', dmin=-1, dmax=-1, depth_scale='sample_core_depth', dir_path='.')[source]#
returns matplotlib figure with anisotropy data plotted against depth available depth scales: ‘sample_composite_depth’, ‘sample_core_depth’, or ‘age’ (you must provide an age file to use this option)
- pmagpy.ipmag.aniso_magic(infile='specimens.txt', samp_file='samples.txt', site_file='sites.txt', verbose=True, ipar=False, ihext=True, ivec=False, isite=False, iboot=False, vec=0, Dir=[], PDir=[], crd='s', num_bootstraps=1000, dir_path='.', fignum=1, save_plots=True, interactive=False, fmt='png', contribution=None, image_records=False)[source]#
Makes plots of anisotropy eigenvectors, eigenvalues and confidence bounds All directions are on the lower hemisphere.
- Parameters:
infile – specimens formatted file with aniso_s data
samp_file – samples formatted file with sample => site relationship
site_file – sites formatted file with site => location relationship
verbose – if True, print messages to output
ipar (confidence bound parameter) – if True - perform parametric bootstrap - requires non-blank aniso_s_sigma
ihext (confidence bound parameter) – if True - Hext ellipses
ivec (confidence bound parameter) – if True - plot bootstrapped eigenvectors instead of ellipses
isite (confidence bound parameter) – if True plot by site, requires non-blank samp_file
iboot (confidence bound parameter) – if True - bootstrap ellipses
vec – eigenvector for comparison with Dir
Dir – [Dec,Inc] list for comparison direction
PDir – [Pole_dec, Pole_Inc] for pole to plane for comparison green dots are on the lower hemisphere, cyan are on the upper hemisphere
crd – [‘s’,’g’,’t’], coordinate system for plotting whereby: s : specimen coordinates, aniso_tile_correction = -1, or unspecified g : geographic coordinates, aniso_tile_correction = 0 t : tilt corrected coordinates, aniso_tile_correction = 100
num_bootstraps – how many bootstraps to do, default 1000
dir_path – directory path
fignum – matplotlib figure number, default 1
save_plots – bool, default True if True, create and save all requested plots
interactive – bool, default False interactively plot and display for each specimen (this is best used on the command line only)
fmt – str, default “svg” format for figures, [svg, jpg, pdf, png]
contribution – pmagpy contribution_builder.Contribution object, if not provided will be created in directory (default None). (if provided, infile/samp_file/dir_path may be left blank)
- pmagpy.ipmag.aniso_magic_nb(infile='specimens.txt', samp_file='samples.txt', site_file='sites.txt', verbose=True, ipar=False, ihext=True, ivec=False, isite=False, iboot=False, vec=0, Dir=[], PDir=[], crd='s', num_bootstraps=1000, dir_path='.', fignum=1, save_plots=True, interactive=False, fmt='png', contribution=None)[source]#
Wrapper for aniso_magic
- pmagpy.ipmag.atrm_magic(meas_file, dir_path='.', input_dir_path='', input_spec_file='specimens.txt', output_spec_file='specimens.txt')[source]#
Converts ATRM data to best-fit tensor (6 elements plus sigma)
- Parameters:
meas_file (str) – input measurement file
dir_path (str) – output directory, default “.”
input_dir_path (str) – input file directory IF different from dir_path, default “”
input_spec_file (str) – input specimen file name, default “specimens.txt”
output_spec_file (str) – output specimen file name, default “specimens.txt”
- Returns:
(True or False indicating if conversion was successful, output file name written)
- Return type:
Tuple
- Info:
Input for is a series of ATRM measurements with optional alteration check The order of the measurements is:
- positions:
labfield parallel to X
labfield parallel to Y
labfield parallel to Z
labfield anti-parallel to X
labfield anti-parallel to Y
labfield anti-parallel to Z
optional: labfield parallel to X
- pmagpy.ipmag.atrm_magic_dm2(meas_file, dir_path='.', input_dir_path='', input_spec_file='specimens.txt', output_spec_file='specimens.txt', data_model_num=2)[source]#
Converts ATRM data to best-fit tensor (6 elements plus sigma)
- Parameters:
meas_file (str) – input measurement file
dir_path (str) – output directory, default “.”
input_dir_path (str) – input file directory IF different from dir_path, default “”
input_spec_file (str) – input specimen file name, default “specimens.txt”
output_spec_file (str) – output specimen file name, default “specimens.txt”
data_model_num (number) – MagIC data model [2, 3], default 3
- Returns:
Tuple
- Return type:
(True or False indicating if conversion was successful, output file name written)
- pmagpy.ipmag.azdip_magic(orient_file='orient.txt', samp_file='samples.txt', samp_con='1', Z=1, method_codes='FS-FD', location_name='unknown', append=False, output_dir='.', input_dir='.', data_model=3)[source]#
takes space delimited AzDip file and converts to MagIC formatted tables
- Parameters:
orient_file – name of azdip formatted input file
samp_file – name of samples.txt formatted output file
samp_con –
integer of sample orientation convention
[1] XXXXY: where XXXX is an arbitrary length site designation and Y is the single character sample designation. e.g., TG001a is the first sample from site TG001. [default]
[2] XXXX-YY: YY sample from site XXXX (XXX, YY of arbitrary length)
[3] XXXX.YY: YY sample from site XXXX (XXX, YY of arbitrary length)
[4-Z] XXXX[YYY]: YYY is sample designation with Z characters from site XXX
[5] site name same as sample
[6] site name entered in site_name column in the orient.txt format input file – NOT CURRENTLY SUPPORTED
[7-Z] [XXXX]YYY: XXXX is site designation with Z characters with sample name XXXXYYYY
method_codes –
colon delimited string with the following as desired
FS-FD field sampling done with a drill
FS-H field sampling done with hand samples
FS-LOC-GPS field location done with GPS
FS-LOC-MAP field location done with map
SO-POM a Pomeroy orientation device was used
SO-ASC an ASC orientation device was used
SO-MAG orientation with magnetic compass
location_name – location of samples
append – boolean. if True, append to the output file
output_dir – path to output file directory
input_dir – path to input file directory
data_model – MagIC data model.
- INPUT FORMAT
- Input files must be space delimited:
Samp Az Dip Strike Dip
- Orientation convention:
- Lab arrow azimuth = mag_azimuth; Lab arrow dip = 90-field_dip
e.g. field_dip is degrees from horizontal of drill direction
- Magnetic declination convention:
Az is already corrected in file
- pmagpy.ipmag.bin_trace(lon_samples, lat_samples, resolution)[source]#
Given a trace of samples in longitude and latitude, bin them in latitude and longitude, and normalize the bins so that the integral of probability density over the sphere is one.
The resolution keyword gives the number of divisions in latitude. The divisions in longitude is twice that.
- Parameters:
lon_samples – a list of longitudes
lat_samples – a list of latitudes
resolution – The resolution keyword gives the number of divisions in latitude. The divisions in longitude is twice that.
- pmagpy.ipmag.bingham_mean(dec=None, inc=None, di_block=None)[source]#
Calculates the Bingham mean and associated statistical parameters from either a list of declination values and a separate list of inclination values or from a di_block (a nested list of [dec, inc, 1.0]). Returns a dictionary with the Bingham mean and statistical parameters.
- Parameters:
dec – list of declinations
inc – list of inclinations or
di_block – a nested list of [dec,inc,1.0] A di_block can be provided instead of dec, inc lists in which case it will be used. Either dec, inc lists or a di_block need to passed to the function.
- Returns:
dictionary containing the Bingham mean and associated statistics.
Examples
Use lists of declination and inclination to calculate a Bingham mean:
>>> ipmag.bingham_mean(dec=[140,127,142,136],inc=[21,23,19,22]) {'Edec': 220.84075754194598, 'Einc': -13.745780972597291, 'Eta': 9.9111522306938742, 'Zdec': 280.38894136954474, 'Zeta': 9.8653370276451113, 'Zinc': 64.23509410796224, 'dec': 136.32637167111312, 'inc': 21.34518678073179, 'n': 4}
Use a di_block to calculate a Bingham mean (will give the same output as the example above with the lists):
>>> ipmag.bingham_mean(di_block=[[140,21],[127,23],[142,19],[136,22]])
- pmagpy.ipmag.bootstrap_fold_test(Data, num_sims=1000, min_untilt=-10, max_untilt=120, bedding_error=0, save=False, save_folder='.', fmt='svg', ninety_nine=False)[source]#
Conduct a bootstrap fold test (Tauxe and Watson, 1994)
Three plots are generated: 1) equal area plot of uncorrected data; 2) tilt-corrected equal area plot; 3) bootstrap results showing the trend of the largest eigenvalues for a selection of the pseudo-samples (red dashed lines), the cumulative distribution of the eigenvalue maximum (green line) and the confidence bounds that enclose 95% of the pseudo-sample maxima. If the confidence bounds enclose 100% unfolding, the data “pass” the fold test.
- Parameters:
Data – a numpy array of directional data [dec, inc, dip_direction, dip] dec, inc are the declination and inclination of the paleomagnetic directions dip_direction, dip are the orientation of the bedding
num_sims – number of bootstrap samples (default is 1000)
min_untilt – minimum percent untilting applied to the data (default is -10%)
max_untilt – maximum percent untilting applied to the data (default is 120%)
bedding_error – (circular standard deviation) for uncertainty on bedding poles
save – optional save of plots (default is False)
save_folder – path to directory where plots should be saved
fmt – format of figures to be saved (default is ‘svg’)
ninety_nine – changes confidence bounds from 95 percent to 99 if True
- Returns:
uncorrected data equal area plot
tilt-corrected data equal area plot
bootstrap results and CDF of the eigenvalue maximum
Examples
Data in separate lists of dec, inc, dip_direction, dip data can be made into the needed array using the
ipmag.make_diddd_array
function.>>> dec = [132.5,124.3,142.7,130.3,163.2] >>> inc = [12.1,23.2,34.2,37.7,32.6] >>> dip_direction = [265.0,265.0,265.0,164.0,164.0] >>> dip = [20.0,20.0,20.0,72.0,72.0] >>> data_array = ipmag.make_diddd_array(dec,inc,dip_direction,dip) >>> data_array array([[ 132.5, 12.1, 265. , 20. ], [ 124.3, 23.2, 265. , 20. ], [ 142.7, 34.2, 265. , 20. ], [ 130.3, 37.7, 164. , 72. ], [ 163.2, 32.6, 164. , 72. ]])
This array can then be passed to the function:
>>> ipmag.bootstrap_fold_test(data_array)
- pmagpy.ipmag.calculate_aniso_parameters(K, n_pos=6)[source]#
calculate anisotropy parameters from n_pos positions plus optional baseline measurements
- pmagpy.ipmag.chi_magic(infile='measurements.txt', dir_path='.', experiments='', fmt='svg', save_plots=True, interactive=False, contribution=None)[source]#
- Parameters:
infile – str, default “measurements.txt” measurement infile
dir_path – str, default “.” input directory
experiments – str, default “” experiment name to plot
fmt – str, default “svg” format for figures, [“svg”, “jpg”, “pdf”, “png”]
save_plots – bool, default True save figures
interactive – bool, default False if True, interactively plot and display (this is best used on the command line only)
contribution – cb.Contribution, default None if provided, use Contribution object instead of reading in data from files
- Returns:
True or False indicating if conversion was successful, file name(s) written
- Return type:
(status, output_files) - Tuple
- pmagpy.ipmag.chi_magic2(path_to_file='.', file_name='magic_measurements.txt', save=False, save_folder='.', fmt='svg')[source]#
Generates plots that compare susceptibility to temperature at different frequencies.
- Parameters:
specified) ((defaults are used if not)
path_to_file – path to directory that contains file (default is current directory, ‘.’)
file_name – name of file to be opened (default is ‘magic_measurements.txt’)
save – boolean argument to save plots (default is False)
save_folder – relative directory where plots will be saved (default is current directory, ‘.’)
- pmagpy.ipmag.combine_magic(filenames, outfile='measurements.txt', data_model=3, magic_table='measurements', dir_path='.', input_dir_path='')[source]#
Takes a list of magic-formatted files, concatenates them, and creates a single file. Returns output filename if the operation was successful.
- Parameters:
filenames – list of MagIC formatted files
outfile – name of output file [e.g., measurements.txt]
data_model – data model number (2.5 or 3), default 3
magic_table – name of magic table, default ‘measurements’
dir_path – str output directory, default “.”
input_dir_path – str input file directory (if different from dir_path), default “”
- Returns:
outfile name if success, False if failure
- pmagpy.ipmag.common_mean_bayes(Data1, Data2, reversal_test=False)[source]#
Estimate the probability that two Fisher-distributed sets of directions originate from populations with a common mean using the Bayesian framework of Heslop and Roberts (2018). This version of the test is the one involving distributions with common precision.
- Parameters:
Data1 – a nested list of directional data [dec,inc] (a di_block)
Data2 – a nested list of directional data [dec,inc] (a di_block)
reversal_test – whether to flip one populations to its antipode (default is False)
- Returns:
BF0 (Bayes factor), P (posterior probability of the hypothesis), support (category of support based on classification of P)
- pmagpy.ipmag.common_mean_bootstrap(Data1, Data2, NumSims=1000, color1='r', color2='b', save=False, save_folder='.', fmt='svg', figsize=(7, 2.3), x_tick_bins=4, verbose=True)[source]#
Conduct a bootstrap test (Tauxe, 2010) for a common mean on two declination, inclination data sets. Plots are generated of the cumulative distributions of the Cartesian coordinates of the means of the pseudo-samples (one for x, one for y and one for z). If the 95 percent confidence bounds for each component overlap, the two set of directions “pass” the test and are consistent with sharing a common mean.
- Parameters:
Data1 – a nested list of directional data [dec,inc] (a di_block)
Data2 – a nested list of directional data [dec,inc] (a di_block) if Data2 is length of 1, treat as single direction
NumSims – number of bootstrap samples (default is 1000)
save – optional save of plots (default is False)
save_folder – path to directory where plots should be saved
fmt – format of figures to be saved (default is ‘svg’)
figsize – optionally adjust figure size (default is (7, 2.3))
x_tick_bins – because they occasionally overlap depending on the data, this argument allows you adjust number of tick marks on the x axis of graphs (default is 4)
- Returns:
three plots (cumulative distributions of the X, Y, Z of bootstrapped means, result (a boolean where 0 is fail and 1 is pass)
Examples
Develop two populations of directions using
ipmag.fishrot()
. Use the function to determine if they share a common mean.>>> directions_A = ipmag.fishrot(k=20, n=30, dec=40, inc=60) >>> directions_B = ipmag.fishrot(k=35, n=25, dec=42, inc=57) >>> ipmag.common_mean_bootstrap(directions_A, directions_B)
- pmagpy.ipmag.common_mean_bootstrap_H23(Data1, Data2, num_sims=10000, alpha=0.05, plot=True, reversal=False, save=False, save_folder='.', fmt='svg', verbose=False)[source]#
Perform a bootstrap common mean direction test following Heslop et al. (2023).
This function uses a nonparametric bootstrap approach to test the null hypothesis of common mean directions between two datasets. It extends the bootstrap common mean direction test of Tauxe et al. 1991 by incorporating a null hypothesis significance testing framework.
- Parameters:
Data1 (array) – Directional data of the first set; each row is [declination, inclination].
Data2 (array) – Directional data of the second set; each row is [declination, inclination].
num_sims (int, optional) – Number of bootstrap simulations to run. Default is 10000.
alpha (float, optional) – Significance level for hypothesis testing. Default is 0.05.
plot (bool, optional) – If True, produces a histogram plot of the test statistic. Default is True.
reversal (bool, optional) – If True, considers antipodal directions for the second dataset. Default is False.
save (bool, optional) – If True, saves the histogram plot. Default is False.
save_folder (str, optional) – Directory where the histogram plot will be saved. Default is the current directory.
fmt (str, optional) – File format for saving the histogram plot. Default is ‘svg’.
- Returns:
- Contains the following elements:
result (int): 0 if null hypothesis is rejected, 1 otherwise.
Lmin (float): The test statistic value.
Lmin_c (float): The critical test statistic value.
p (float): The p-value of the test.
- Return type:
tuple
References
Heslop, D., Scealy, J. L., Wood, A. T. A., Tauxe, L., & Roberts, A. P. (2023). A bootstrap common mean direction test. Journal of Geophysical Research: Solid Earth, 128, e2023JB026983. https://doi.org/10.1029/2023JB026983
- pmagpy.ipmag.common_mean_watson(Data1, Data2, NumSims=5000, print_result=True, plot='no', save=False, save_folder='.', fmt='svg')[source]#
Conduct a Watson V test for a common mean on two directional data sets.
This function calculates Watson’s V statistic from input lists through Monte Carlo simulation in order to test whether two populations of directional data could have been drawn from a common mean. The critical angle between the two sample mean directions and the corresponding McFadden and McElhinny (1990) classification is printed.
- Parameters:
Data1 – a nested list of directional data [dec,inc] (a di_block)
Data2 – a nested list of directional data [dec,inc] (a di_block)
NumSims – number of Monte Carlo simulations (default is 5000)
print_result – default is to print the test result (True)
plot – the default is no plot (‘no’). Putting ‘yes’ will the plot the CDF from the Monte Carlo simulations.
save – optional save of plots (default is False)
save_folder – path to where plots will be saved (default is current)
fmt – format of figures to be saved (default is ‘svg’)
- Returns:
printed text (text describing the test result is printed), result (a boolean where 0 is fail and 1 is pass), angle (angle between the Fisher means of the two data sets), critical_angle (critical angle for the test to pass), classification (MM1990 classification for a positive test),
Examples
Develop two populations of directions using
ipmag.fishrot
. Use the function to determine if they share a common mean.>>> directions_A = ipmag.fishrot(k=20, n=30, dec=40, inc=60) >>> directions_B = ipmag.fishrot(k=35, n=25, dec=42, inc=57) >>> ipmag.common_mean_watson(directions_A, directions_B)
- pmagpy.ipmag.conglomerate_test_Watson(R, n)[source]#
The Watson (1956) test of a directional data set for randomness compares the resultant vector (R) of a group of directions to values of Ro. If R exceeds Ro, the null hypothesis of randomness is rejected. If R is less than Ro, the null hypothesis of randomness is considered to not be rejected.
- Parameters:
R – the resultant vector length of the directions
n – the number of directions
- Returns:
printed text (text describing test result), result (a dictionary with the Watson (1956) R values)
- pmagpy.ipmag.contribution_to_magic(contribution, dir_path='.')[source]#
Write a contribution object to MagIC-formatted files in the specified directory. Compiles these files into a upload.txt file which can be uploaded into the MagIC database using the upload_magic function.
- Parameters:
contribution (Contribution) – A contribution object containing tables to be written to a MagIC-formatted file.
dir_path (str) – The directory path where the MagIC-formatted file will be written.
- pmagpy.ipmag.core_depthplot(input_dir_path='.', meas_file='measurements.txt', spc_file='', samp_file='samples.txt', age_file='', sum_file='', wt_file='', depth_scale='core_depth', dmin=-1, dmax=-1, sym='bo', size=5, spc_sym='ro', spc_size=5, meth='', step=0, fmt='svg', pltDec=True, pltInc=True, pltMag=True, pltLine=True, pltSus=True, logit=False, pltTime=False, timescale=None, amin=-1, amax=-1, norm=False, data_model_num=3, location='')[source]#
depth scale can be ‘core_depth’ or ‘composite_depth’ (for data model=3) if age file is provided, depth_scale will be set to ‘age’ by default. You must provide at least a measurements,specimens and sample file to plot.
- Parameters:
input_dir_path – str, default “.” file input directory
meas_file – str, default “measurements.txt” input measurements file
spc_file – str, default “” input specimens file
samp_file – str, default “” input samples file
age_file – str, default “” input ages file
sum_file – str, default “” input csv summary file
wt_file – str, default “” input file with weights
depth_scale – str, default “core_depth” [‘core_depth’, ‘composite_depth’]
dmin – number, default -1 minimum depth to plot (if -1, default to plotting all)
dmax – number, default -1 maximum depth to plot (if -1, default to plotting all)
sym – str, default “bo” symbol color and shape, default blue circles (see matplotlib documentation for more options)
size – int, default 5 symbol size
spc_sym – str, default ‘ro’ specimen symbol color and shape, default red circles (see matplotlib documentation for more options)
meth – str, default “” method codes, [“LT-NO”, “AF”, “T”, “ARM”, “IRM”, “X”]
step –
int, default 0 treatment step for plotting:
for AF, in mT, for T, in C
fmt – str, default “svg” format for figures, [svg,jpg,png,pdf]
pltDec – bool, default True plot declination
pltInc – bool, default True plot inclination
pltMag – bool, default True plot magnetization
pltLine – bool, default True connect dots with a line
pltSus – bool, default True plot blanket treatment
logit – bool, default False plot magnetization on a log scale
amin – int, default -1 minimum time to plot (if -1, default to plotting all)
amax – int, default -1 maximum time to plot (if -1, default to plotting all)
norm – bool, default False normalize by weight
data_model_num – int, default 3 MagIC data model (please, use data model 3)
- Returns:
main_plot, figname
- pmagpy.ipmag.create_private_contribution(username='', password='')[source]#
Create a private contribution on earthref.org/MagIC.
- Parameters:
username – str personal username for MagIC
password – str password for username
- Returns:
- response API requests.models.Response
- response.status_code: bool
True : successful creation of private workspace
- response[‘url’]str
URL of request
- response[‘method’]str
’POST’
- response[‘id’]str
if successful, MagIC ID number created
- response[‘errors’]str
if unsuccessful, error message
- pmagpy.ipmag.criteria_extract(crit_file='criteria.txt', output_file='criteria.xls', output_dir_path='.', input_dir_path='', latex=False)[source]#
Extracts criteria from a MagIC 3.0 format criteria.txt file. Default output format is an Excel file. typeset with latex on your own computer.
- Parameters:
crit_file – str, default “criteria.txt” input file name
output_file – str, default “criteria.xls” output file name
output_dir_path – str, default “.” output file directory
input_dir_path – str, default “” path for intput file if different from output_dir_path (default is same)
latex – boolean, default False if True, output file should be latex formatted table with a .tex ending
- Returns :
[True,False], data table error type : True if successful
- Effects :
writes xls or latex formatted tables for use in publications
- pmagpy.ipmag.cumulative_density_distribution(lon_samples, lat_samples, resolution=30)[source]#
compute cumulative density distribution of a set of vectors on a unit sphere
- Parameters:
lon_samples – a list of longitudes
lat_samples – a list of latitudes
resolution – the resolution at which to calculate the vectors distribution. the higher the number, the finer the resolution
- Returns:
- Tuple
longitude grid, latitude grid, and cumulative densities
- pmagpy.ipmag.curie(path_to_file='.', file_name='', magic=False, window_length=3, save=False, save_folder='.', fmt='svg', t_begin='', t_end='')[source]#
Plots and interprets curie temperature data. The 1st derivative is calculated from smoothed M-T curve (convolution with triangular window with width= <-w> degrees) The 2nd derivative is calculated from smoothed 1st derivative curve (using the same sliding window width) The estimated curie temperation is the maximum of the 2nd derivative. Temperature steps should be in multiples of 1.0 degrees.
- Parameters:
file_name – name of file to be opened
path_to_file – path to directory that contains file (default is current directory, ‘.’)
window_length – dimension of smoothing window (input to smooth() function)
save – boolean argument to save plots (default is False)
save_folder – relative directory where plots will be saved (default is current directory, ‘.’)
fmt – format of saved figures (default is svg)
t_begin – start of truncated window for search (default is beginning of data)
t_end – end of truncated window for search (default is end of data)
magic – True if MagIC formatted measurements.txt file
- Returns:
A plot is shown and saved if save=True.
- pmagpy.ipmag.dayplot_magic(path_to_file='.', hyst_file='specimens.txt', rem_file='', save=True, save_folder='.', fmt='svg', data_model=3, interactive=False, contribution=None, image_records=False)[source]#
Makes ‘day plots’ (Day et al. 1977) and squareness/coercivity plots (Neel, 1955; plots after Tauxe et al., 2002); plots ‘linear mixing’ curve from Dunlop and Carter-Stiglitz (2006).
- Parameters:
path_to_file – path to directory that contains files (default is current directory, ‘.’)
(data_model=3 (the default input file is 'specimens.txt')
2 (if data_model =) – hyst_file : hysteresis file (default is ‘rmag_hysteresis.txt’) rem_file : remanence file (default is ‘rmag_remanence.txt’)
defaults (then must these are the) – hyst_file : hysteresis file (default is ‘rmag_hysteresis.txt’) rem_file : remanence file (default is ‘rmag_remanence.txt’)
save – boolean argument to save plots (default is True)
save_folder – relative directory where plots will be saved (default is current directory, ‘.’)
fmt – format of saved figures (default is ‘pdf’)
image_records (boolean) – generate and return a record for each image in a list of dicts which can be ingested by pmag.magic_write, default is False
- pmagpy.ipmag.delete_private_contribution(contribution_id, username='', password='')[source]#
Delete a private contribution on earthref.org/MagIC.
- Parameters:
contribution_id – int ID of MagIC contribution to delete
username – str personal username for MagIC
password – str password for username
- Returns:
- response (API requests.models.Response)
- response.status_code: bool
True : successful creation of private workspace
- response[‘url’]str
URL of request
- response[‘method’] :
’DELETE’
- response[‘id’]str
if successful, MagIC ID contribution deleted
- response[‘errors’]str
if unsuccessful, error message
- pmagpy.ipmag.demag_magic(path_to_file='.', file_name='magic_measurements.txt', save=False, save_folder='.', fmt='svg', plot_by='loc', treat=None, XLP='', individual=None, average_measurements=False, single_plot=False)[source]#
Takes demagnetization data (from magic_measurements file) and outputs intensity plots (with optional save).
- Parameters:
path_to_file – path to directory that contains files (default is current directory, ‘.’)
file_name – name of measurements file (default is ‘magic_measurements.txt’)
save – boolean argument to save plots (default is False)
save_folder – relative directory where plots will be saved (default is current directory, ‘.’)
fmt – format of saved figures (default is ‘svg’)
plot_by – specifies what sampling level you wish to plot the data at (‘loc’ – plots all samples of the same location on the same plot ‘exp’ – plots all samples of the same expedition on the same plot ‘site’ – plots all samples of the same site on the same plot ‘sample’ – plots all measurements of the same sample on the same plot ‘spc’ – plots each specimen individually)
treat – treatment step ‘T’ = thermal demagnetization ‘AF’ = alternating field demagnetization ‘M’ = microwave radiation demagnetization (default is ‘AF’)
XLP – filter data by a particular method
individual – This function outputs all plots by default. If plotting by sample or specimen, you may not wish to see (or wait for) every single plot. You can therefore specify a particular plot by setting this keyword argument to a string of the site/sample/specimen name.
average_measurements – Option to average demagnetization measurements by the grouping specified with the ‘plot_by’ keyword argument (default is False)
single_plot – Option to output a single plot with all measurements (default is False)
- pmagpy.ipmag.density_distribution(lon_samples, lat_samples, resolution=30)[source]#
calculate density distribution of a given set of vectos on a sphere
- Parameters:
lon_samples – a list of longitudes
lat_samples – a list of latitudes
resolution – the resolution at which to calculate the vectors distribution. the higher the number, the finer the resolution (default is 30)
- pmagpy.ipmag.df_depthplot(df, d_key='core_depth', fmt='png', location='unknown', save=False)[source]#
Makes depth (or height) plots of various columns in the dataframe
- Parameters:
df – pandas dataframe
d_key (str) – name of column for plotting against [‘core_depth’,’composite_depth’,’height’]
fmt (str) – format of saved figure, default is ‘png’
location (str) – name of location
save (boolean) – if True, save plot to location.fmt
- pmagpy.ipmag.dmag_magic(in_file='measurements.txt', dir_path='.', input_dir_path='', spec_file='specimens.txt', samp_file='samples.txt', site_file='sites.txt', loc_file='locations.txt', plot_by='loc', LT='AF', norm=True, XLP='', save_plots=True, fmt='svg', interactive=False, n_plots=5, contribution=None)[source]#
plots intensity decay curves for demagnetization experiments
- Parameters:
in_file (str) – default “measurements.txt”
dir_path (str) – output directory, default “.”
input_dir_path (str) – input file directory (if different from dir_path), default “”
spec_file (str) – input specimen file name, default “specimens.txt”
samp_file (str) – input sample file name, default “samples.txt”
site_file (str) – input site file name, default “sites.txt”
loc_file (str) – input location file name, default “locations.txt”
plot_by (str) – [spc, sam, sit, loc] (specimen, sample, site, location), default “loc”
LT (str) – lab treatment [T, AF, M], default AF
norm (bool) – normalize by NRM magnetization, default True
XLP (str) – exclude specific lab protocols, (for example, method codes like LP-PI) default “”
save_plots (bool) – plot and save non-interactively, default True
fmt (str) – str [“png”, “svg”, “pdf”, “jpg”], default “svg”
interactive (bool) – default False interactively plot and display for each specimen (this is best used on the command line only)
n_plots (int) – default 5 maximum number of plots to make if you want to make all possible plots, specify “all”
contribution – cb.Contribution, default None if provided, use Contribution object instead of reading in data from files
- Returns:
True or False indicating if conversion was successful, file name(s) written
- pmagpy.ipmag.dms2dd(degrees, minutes, seconds)[source]#
Convert latitude/longitude of a location that is in degrees, minutes, seconds to decimal degrees
- Parameters:
degrees – degrees of latitude/longitude
minutes – minutes of latitude/longitude
seconds – seconds of latitude/longitude
- Returns:
- float
decimal degrees of location
Examples
Convert 180 degrees 4 minutes 23 seconds to decimal degrees:
>>> ipmag.dms2dd(180,4,23) 180.07305555555556
- pmagpy.ipmag.do_flip(dec=None, inc=None, di_block=None, unit_vector=True)[source]#
This function returns the antipode (i.e. it flips) of directions.
The function can take dec and inc as separate lists if they are of equal length and explicitly specified or are the first two arguments. It will then return a list of flipped decs and a list of flipped incs. If a di_block (a nested list of [dec, inc, 1.0]) is specified then it is used and the function returns a di_block with the flipped directions.
- Parameters:
dec – list of declinations
inc – list of inclinations
di_block – a nested list of [dec, inc, 1.0] A di_block can be provided instead of dec, inc lists in which case it will be used. Either dec, inc lists or a di_block need to passed to the function.
unit_vector – if True will return [dec,inc,1.]; if False will return [dec,inc]
True) ((default is)
- Returns:
either dec_flip, inc_flip as lists of flipped declinations and inclinations or dflip as a nested list of [dec, inc, 1.0] or [dec, inc]
Examples
Lists of declination and inclination can be flipped to their antipodes:
>>> decs = [1.0, 358.0, 2.0] >>> incs = [10.0, 12.0, 8.0] >>> ipmag.do_flip(decs, incs) ([181.0, 178.0, 182.0], [-10.0, -12.0, -8.0])
The function can also take a di_block and returns a flipped di_block:
>>> directions = [[1.0,10.0],[358.0,12.0,],[2.0,8.0]] >>> ipmag.do_flip(di_block=directions) [[181.0, -10.0, 1.0], [178.0, -12.0, 1.0], [182.0, -8.0, 1.0]]
- pmagpy.ipmag.download_magic(infile=None, dir_path='.', input_dir_path='', overwrite=False, print_progress=True, data_model=3.0, separate_locs=False, txt='', excel=False)[source]#
Takes the name of a text file downloaded from the MagIC database and unpacks it into MagIC-formatted files. by default, download_magic assumes that you are doing everything in your current directory. if not, you may provide optional arguments dir_path (where you want the results to go) and input_dir_path (where the downloaded file is IF that location is different from dir_path).
- Parameters:
infile – str MagIC-format file to unpack
dir_path – str output directory (default “.”)
input_dir_path – str, default “” path for intput file if different from output_dir_path (default is same)
overwrite – bool overwrite current directory (default False)
print_progress – bool verbose output (default True)
data_model – float MagIC data model 2.5 or 3 (default 3)
separate_locs – bool create a separate directory for each location (Location_*) (default False)
txt – str, default “” if infile is not provided, you may provide a string with file contents instead (useful for downloading MagIC file directly from earthref)
excel – bool input file is an excel spreadsheet (as downloaded from MagIC)
- Returns:
- bool
True if the unpacking operation is successful. False otherwise.
- pmagpy.ipmag.download_magic_from_doi(doi)[source]#
Download a public contribution matching the provided DOI from earthref.org/MagIC.
- Parameters:
doi – str DOI for a MagIC
- Returns:
bool message : str
Error message if download didn’t succeed
- Return type:
result
- pmagpy.ipmag.download_magic_from_id(magic_id, directory='.', share_key='')[source]#
Downloads a contribution from earthref.org/MagIC using the provided ID and saves it to the specified directory. If the directory does not exist, it is created. If a share_key is provided, it downloads a private contribution.
- Parameters:
magic_id (str) – Unique ID for a MagIC contribution.
directory (str) – Path to save the file. Defaults to current directory.
share_key (str) – Share key for downloading from Private Contribution; default is “” for public contribution.
- Returns:
True if successful, False otherwise. str: Relative file path if successful or error message if failed.
- Return type:
bool
- pmagpy.ipmag.eigs_s(infile='', dir_path='.')[source]#
Converts eigenparamters format data to s format
- Parameters:
Input –
- fileinput file name with eigenvalues (tau) and eigenvectors (V) with format:
tau_1 V1_dec V1_inc tau_2 V2_dec V2_inc tau_3 V3_dec V3_inc
- Returns:
- the six tensor elements as a nested array
[[x11,x22,x33,x12,x23,x13],….]
- pmagpy.ipmag.ellipse(map_axis, centerlon, centerlat, major_axis, minor_axis, angle, n=360, filled=False, transform=None, **kwargs)[source]#
This function enables general error ellipses to be drawn on the cartopy projection of the input map axis using a center and a set of major and minor axes and a rotation angle east of north. (Adapted from equi).
- Parameters:
map_axis (cartopy axis)
centerlon (longitude of the center of the ellipse)
centerlat (latitude of the center of the ellipse)
major_axis (Major axis of ellipse in km)
minor_axis (Minor axis of ellipse in km)
angle (angle of major axis in degrees east of north)
n (number of points with which to apporximate the ellipse)
filled (boolean specifying if the ellipse should be plotted as a filled polygon or) – as a set of line segments (Doesn’t work right now)
kwargs (any other key word arguments can be passed for the line)
Returns – The map object with the ellipse plotted on it
- pmagpy.ipmag.eqarea_magic(in_file='sites.txt', dir_path='.', input_dir_path='', spec_file='specimens.txt', samp_file='samples.txt', site_file='sites.txt', loc_file='locations.txt', plot_by='all', crd='g', ignore_tilt=False, save_plots=True, fmt='svg', contour=False, color_map='coolwarm', plot_ell='', n_plots=5, interactive=False, contribution=None, source_table='sites', image_records=False)[source]#
makes equal area projections from declination/inclination data
- Parameters:
in_file – str, default “sites.txt”
dir_path – str output directory, default “.”
input_dir_path – str input file directory (if different from dir_path), default “”
spec_file – str input specimen file name, default “specimens.txt”
samp_file – str input sample file name, default “samples.txt”
site_file – str input site file name, default “sites.txt”
loc_file – str input location file name, default “locations.txt”
plot_by – str [spc, sam, sit, loc, all] (specimen, sample, site, location, all), default “all”
crd – [‘s’,’g’,’t’], coordinate system for plotting whereby: s : specimen coordinates, aniso_tile_correction = -1 g : geographic coordinates, aniso_tile_correction = 0 (default) t : tilt corrected coordinates, aniso_tile_correction = 100
ignore_tilt – bool default False. If True, data are unoriented (allows plotting of measurement dec/inc)
save_plots – bool plot and save non-interactively, default True
fmt – str [“png”, “svg”, “pdf”, “jpg”], default “svg”
contour – bool plot as color contour
colormap – str color map for contour plotting, default “coolwarm” see cartopy documentation for more options
plot_ell – str [F,K,B,Be,Bv] plot Fisher, Kent, Bingham, Bootstrap ellipses or Bootstrap eigenvectors default “” plots none
n_plots – int maximum number of plots to make, default 5 if you want to make all possible plots, specify “all”
interactive – bool, default False interactively plot and display for each specimen (this is best used on the command line or in the Python interpreter)
contribution – cb.Contribution, default None if provided, use Contribution object instead of reading in data from files
source_table – table to get plot data from (only needed with contribution argument) for example, you could specify source_table=”measurements” and plot_by=”sites” to plot measurement data by site. default “sites”
image_records – generate and return a record for each image in a list of dicts which can be ingested by pmag.magic_write bool, default False
- Returns:
- if image_records == False
type - Tuple : (True or False indicating if conversion was successful, file name(s) written)
- if image_records == True
True or False indicating if conversion was successful, output file name written, list of image recs
- pmagpy.ipmag.equi(map_axis, centerlon, centerlat, radius, color, alpha=1.0, outline=True, fill=False, lw=1)[source]#
This function enables A95 error ellipses to be drawn in cartopy around paleomagnetic poles in conjunction with shoot (modified from: http://www.geophysique.be/2011/02/20/matplotlib-basemap-tutorial-09-drawing-circles/).
- Parameters:
map_axis – cartopy axis
centerlon – longitude of the center of the ellipse
centerlat – latitude of the center of the ellipse
radius – radius of ellipse (in degrees)
color – color of ellipse
alpha – transparency - if filled, the transparency will only apply to the facecolor of the ellipse
outline – boolean specifying if the ellipse should be plotted as a filled polygon or as a set of line segments
fill – boolean specifying if the ellipse should be plotted as a filled polygon
- pmagpy.ipmag.f_factor_calc(inc_observed, inc_field)[source]#
Calculate the flattening factor (f) from an observed inclination in comparison to the expected inclination.
- Parameters:
inc_observed – the observed inclination (e.g. magnetization of sediment)
inc_field – inclination of field in which magnetization was acquired
- Returns:
the flattening factor
- Return type:
f_factor
Examples
Calculate the f factor for an inclination that was shallowed from 40 degrees to 25 degrees:
>>> ipmag.f_factor_calc(25,40) 0.5557238268604126
- pmagpy.ipmag.find_compilation_kent(plon, plat, A95, slon, slat, f_from_compilation=None, n=10000, n_fish=100, return_poles=False, return_kent_stats=True, return_paleolats=False, map_central_longitude=0, map_central_latitude=0)[source]#
Applies flattening factors from the compilation to sedimentary paleomagnetic pole where only pole longitude, pole latitude, A95, site longitude, and site latitude are available.
First, calculate the paleomagnetic direction at the site of the mean pole using plon, plat via pmag.vgp_di. Then draw n resamples from the compiled f values in the compilation. The default compilation of Pierce et al., 2022 can be used or the user can provide their own compilation.
Unsquish the directions with the resampled f factors, then convert the mean directions back to pole space. Making the simplifying assumption that A95 is the same as the directions are unflattened. Resample n_fish mean poles from the Fisher distribution given the unsquished plon, plat, and A95. This will result in a total of n*n_fish number of resampled mean poles. Summarize the distribution of the mean poles using a Kent distribution.
- Parameters:
plon – legacy mean pole longitude
plat – legacy mean pole latitude
A95 – legacy mean pole A95
slon – site longitude
slat – site latitude
f_from_compilation – list of f factors (default is None in which case the compilation of Pierce et al., 2022 Table S1 will be used)
n – number of resamples from compilation (default is 10000)
n_fish – number of resamples from each Fisher mean pole position (default is 100)
return_poles – whether or not to return the resampled mean pole positions (default is False)
return_kent_stats – whether or not to return the calculated Kent distribution statistics of the resampled mean poles (default is True)
return_paleolats – whether or not to return the computed compilation paleolatitudes (default is False)
map_central_longitude – central longitude for the orthographic map (default is 0)
map_central_latitude – central latitude for the orthographic map (default is 0)
- Returns:
compilation_mean_lons, compilation_mean_lats: resampled mean pole positions
f_compilation_kent_distribution_95: Kent distribution statistics
compilation_paleolats: computed compilation paleolatitudes
- Return type:
Depending on the combination of boolean flags provided, returns one or more of
- pmagpy.ipmag.find_ei(data, nb=1000, save=False, save_folder='.', fmt='svg', site_correction=False, return_new_dirs=False, figprefix='EI', return_values=False, num_resample_to_plot=1000, data_color='k', EI_color='r', resample_EI_color='grey', resample_EI_alpha=0.05, tight_axes=False)[source]#
Applies series of assumed flattening factors and “unsquishes” inclinations assuming tangent function. Finds flattening factor that gives elongation/inclination pair consistent with TK03; or, if correcting by site instead for study-level secular variation, finds flattening factor that minimizes elongation and most resembles a Fisherian distribution. Finds bootstrap confidence bounds
- Parameters:
data – a nested list of dec/inc pairs
nb – number of bootstrapped pseudo-samples (default is 1000)
save – Boolean argument to save plots (default is False)
save_folder – path to folder in which plots should be saved (default is current directory)
fmt – specify format of saved plots (default is ‘svg’)
figfile – name of saved file plus format string
site_correction – Boolean argument to specify whether to “unsquish” data to 1) the elongation/inclination pair consistent with TK03 secular variation model (site_correction = False) or 2) a Fisherian distribution (site_correction = True). Default is FALSE. Note that many directions (~ 100) are needed for this correction to be reliable.
return_new_dirs – optional return of newly “unflattened” directions as di_block (default is False)
return_values – optional return of all bootstrap result inclinations, elongations, and f factors (default is False)
return_values=True (if both return_new_dirs=True and) – di_block of new directions, inclinations, elongations,and f factors
return (the function will) – di_block of new directions, inclinations, elongations,and f factors
num_resample_to_plot – number of bootstrap resample elongation/inclination curves to plot (default to to plot all)
data_color – the color of the direction equal area plot data (default is black)
EI_color – the color of the EI curve associated with the most frequent f value (rounded to 2 decimal points, default is red)
resample_EI_color – the color of the EI curves for all f values except for the most frequent f (default is grey)
resample_EI_alpha – the transparency of the EI curves for all f values except for the most frequent f (default is grey)
tight_axes – optional argument to tighten up the axes limits for the inclination-elongation figure
- Returns:
equal area plot of original directions
Elongation/inclination pairs as a function of f, data plus 25 bootstrap samples
- Cumulative distribution of bootstrapped optimal inclinations plus uncertainties.
Estimate from original data set plotted as solid line
Orientation of principle direction through unflattening
Note
If distribution does not have a solution, plot labeled: Pathological. Some bootstrap samples may have valid solutions and those are plotted in the CDFs and E/I plot.
- pmagpy.ipmag.find_ei_kent(data, site_latitude, site_longitude, kent_color='k', nb=1000, save=False, save_folder='.', fmt='svg', return_new_dirs=False, return_values=False, figprefix='EI', num_resample_to_plot=1000, EI_color='r', resample_EI_color='grey', resample_EI_alpha=0.05, vgp_nb=100, cmap='viridis_r', central_longitude=0, central_latitude=0)[source]#
Applies series of assumed flattening factor and “unsquishes” inclinations assuming tangent function. Finds flattening factor that gives elongation/inclination pair consistent with TK03 Finds bootstrap confidence bounds Based on all flattening factors from the E/I bootstrap results find the distribution of paleolatitudes and fit with a normal distribution Based on all flattening factors from the E/I bootstrap results calculate the correspondant VGP pole positions and their mean poles associated with each factor Perform Monte Carlo resample of the mean poles associated with each flattening factor Finds the Kent distribution statistics: mean, major and minor axes and their associated angles of dispersion.
- Parameters:
data – a nested list of dec/inc pairs
site_latitude – location of the paleomagnetic site
site_longitude – location of the paleomagnetic site
kent_color – color of the Kent ellipse to plot (default is black)
nb – number of bootstrapped pseudo-samples (default is 1000)
save – Boolean argument to save plots (default is False)
save_folder – path to folder in which plots should be saved (default is current directory)
fmt – specify format of saved plots (default is ‘svg’)
return_new_dirs – optional return of newly “unflattened” directions as di_block (default is False)
return_values –
optional return of all bootstrap result inclinations, elongations, and f factors (default is False)
if both return_new_dirs=True and return_values=True, the function will return di_block of new directions, inclinations, elongations,and f factors
figprefix – prefix string for the name of the figures to be saved
EI_color – color of the most elongation/inclination curve corresponding to the most frequent f value
resample_EI_color – color of all other elongation/inclination curves
vgp_nb – number of virtual geomagnetic poles to resample for each iteration, the total VGPs resampled will be vgp_nb*nb
cmap – matplotlib color map used for plotting corrected paleomagnetic directions
central_longitude – central point of pole projection (defaults are 0)
central_latitude – central point of pole projection (defaults are 0)
num_resample_to_plot – number of bootstrap resample elongation/inclination curves to plot (default to to plot all)
vgp_nb – number of vgp to resample using a Monte Carlo approach with each f factor
cmap – matplotlib color map for color-coding the directions and mean poles based on the f factor
EI_color – the color of the EI curve associated with the most frequent f value (rounded to 2 decimal points, default is red)
resample_EI_color – the color of the EI curves for all f values except for the most frequent f (default is grey)
resample_EI_alpha – the transparency of the EI curves for all f values except for the most frequent f (default is grey)
- Returns:
equal area plot of original directions
Elongation/inclination pairs as a function of f, data plus 25 bootstrap samples
Cumulative distribution of bootstrapped optimal inclinations plus uncertainties. Estimate from original data set plotted as solid line
Orientation of principle direction through unflattening
- Return type:
four plots
Note
If distribution does not have a solution, plot labeled: Pathological. Some bootstrap samples may have valid solutions and those are plotted in the CDFs and E/I plot.
- pmagpy.ipmag.fisher_angular_deviation(dec=None, inc=None, di_block=None, confidence=95)[source]#
The angle from the true mean within which a chosen percentage of directions lie can be calculated from the Fisher distribution. This function uses the calculated Fisher concentration parameter to estimate this angle from directional data. The 63 percent confidence interval is often called the angular standard deviation.
- Parameters:
dec – list of declinations or longitudes
inc – list of inclinations or latitudes or
di_block – a nested list of [dec,inc,1.0] A di_block can be provided instead of dec, inc lists in which case it will be used. Either dec, inc lists or a di_block need to be provided.
confidence – 50 percent, 63 percent or 95 percent (default is 95 percent)
- Returns:
- float
theta is returned which is the critical angle of interest from the mean which contains the percentage of directions specified by the confidence parameter
- pmagpy.ipmag.fisher_mean(dec=None, inc=None, di_block=None)[source]#
Calculates the Fisher mean and associated parameters from either a list of declination values and a separate list of inclination values or from a di_block (a nested list of [dec,inc,1.0]). Returns a dictionary with the Fisher mean and statistical parameters.
- Parameters:
dec – list of declinations or longitudes
inc – list of inclinations or latitudes or
di_block – a nested list of [dec,inc,1.0] A di_block can be provided instead of dec, inc lists in which case it will be used. Either dec, inc lists or a di_block need to be provided.
- Returns:
- dictionary
Dictionary containing the Fisher mean parameters. This dictionary can be printed in a more readable fashion using the
ipmag.print_direction_mean()
function if it is a directional mean oripmag.print_pole_mean()
function if it is a pole mean.
Examples
Use lists of declination and inclination to calculate a Fisher mean:
>>> ipmag.fisher_mean(dec=[140,127,142,136],inc=[21,23,19,22]) {'alpha95': 7.292891411309177, 'csd': 6.4097743211340896, 'dec': 136.30838974272072, 'inc': 21.347784026899987, 'k': 159.69251473636305, 'n': 4, 'r': 3.9812138971889026}
Use a di_block to calculate a Fisher mean (will give the same output as the example with the lists):
>>> ipmag.fisher_mean(di_block=[[140,21],[127,23],[142,19],[136,22]])
- pmagpy.ipmag.fisher_mean_resample(alpha95=20, n=100, dec=0, inc=90, di_block=True)[source]#
Generates resamples of directional means from a Fisher mean with a specified alpha95. Equivalent of sampling from the angular standard deviation.
- Parameters:
alpha95 – 95% confidence on mean direction (default is 20)
n – number of vectors to determine (default is 100)
dec – mean declination of distribution (default is 0)
inc – mean inclination of distribution (default is 90)
di_block – if True (default), this function returns a nested list of [dec,inc,1.0]; if False, it returns a list of dec and a list of inc
- Returns:
di_block, a nested list of [dec,inc,1.0] (default)
dec, inc, a list of dec and a list of inc (if di_block = False)
Examples
>>> ipmag.fisher_mean_resample(alpha95=5, n=5, dec=40, inc=60)
[[41.47587050719005, 62.44682515754436, 1.0], [33.738299775944085, 55.88461662263949, 1.0], [42.98707546462242, 60.21460942319564, 1.0], [28.282076485842992, 59.67015046929257, 1.0], [41.87081053973009, 57.18064045614739, 1.0]]
- pmagpy.ipmag.fishqq(lon=None, lat=None, di_block=None, plot=True, save=False, fmt='png', save_folder='.')[source]#
Test whether a distribution is Fisherian and make a corresponding Q-Q plot. The Q-Q plot shows the data plotted against the value expected from a Fisher distribution. The first plot is the uniform plot which is the Fisher model distribution in terms of longitude (declination). The second plot is the exponential plot which is the Fisher model distribution in terms of latitude (inclination). In addition to the plots, the test statistics Mu (uniform) and Me (exponential) are calculated and compared against the critical test values. If Mu or Me are too large in comparison to the test statistics, the hypothesis that the distribution is Fisherian is rejected (see Fisher et al., 1987). These test statistics are returned in a dictionary.
- Parameters:
lon – longitude or declination of the data
lat – latitude or inclination of the data or
di_block – a nested list of [dec,inc] A di_block can be provided in which case it will be used instead of dec, inc lists.
plot – boolean to decide whether to make a plot (default is True)
save – boolean to decide whether plot is saved (default is False)
save_folder – relative directory where plots will be saved (default is current directory, ‘.’)
fmt – format of saved plot (default is ‘png’)
- Returns:
- dictionary
lon, mean longitude (or declination)
lat, mean latitude (or inclination)
N, number of vectors
Mu, Mu test statistic value for the data
Mu_critical, critical value for Mu
Me, Me test statistic value for the data
Me_critical, critical value for Me
If the data have two modes, each with N >= 10 (normal and reverse), two of these dictionaries will be returned.
Examples
In this example, directions are sampled from a Fisher distribution using ipmag.fishrot and then the ipmag.fishqq function is used to test whether that distribution is Fisherian:
>>> directions = ipmag.fishrot(k=40, n=50, dec=200, inc=50)
>>> ipmag.fishqq(di_block = directions)
{'Dec': 199.73564290371894, 'Inc': 49.017612342358298, 'Me': 0.78330310031220352, 'Me_critical': 1.094, 'Mode': 'Mode 1', 'Mu': 0.69915926146177099, 'Mu_critical': 1.207, 'N': 50, 'Test_result': 'consistent with Fisherian model'}
The above example passed a di_block to the function as an input. Lists of paired declination and inclination can also be used as inputs. Here the directions di_block is unpacked into separate declination and inclination lists using the ipmag.unpack_di_block function, which are then used as input to fishqq:
>>> dec_list, inc_list = ipmag.unpack_di_block(directions)
>>> ipmag.fishqq(lon=dec_list, lat=inc_list)
- pmagpy.ipmag.fishrot(k=20, n=100, dec=0, inc=90, di_block=True)[source]#
Generates Fisher distributed unit vectors from a specified distribution using the pmag.py functions pmag.fshdev() and pmag.dodirot_V().
- Parameters:
k – kappa precision parameter (default is 20)
n – number of vectors to determine (default is 100)
dec – mean declination of distribution (default is 0)
inc – mean inclination of distribution (default is 90)
di_block – if True (default), this function returns a nested list of [dec,inc,1.0]; if False, it returns a list of dec and a list of inc
- Returns:
di_block, a nested list of [dec,inc,1.0] (default)
dec, inc, a list of dec and a list of inc (if di_block = False)
Examples
>>> ipmag.fishrot(k=20, n=5, dec=40, inc=60)
array([[55.30451720381376 , 56.186057037482435, 1. ],
       [25.593998008087908, 63.544360587984784, 1. ],
       [29.263675539971246, 54.58964868129066 , 1. ],
       [61.28572459596148 , 51.5004074156194  , 1. ],
       [55.20784339888985 , 54.186746152272484, 1. ]])
- pmagpy.ipmag.get_matrix(n_pos=6)[source]#
returns design matrix for anisotropy experiments
- Parameters:
n_pos – anisotropy experiment positions (default is 6, can be 6, 9 or 15)
- Returns:
matrix for n_pos of 6,9, or 15
- Matrices definitions:
A – design matrix
B – np.dot(inv(np.dot(A.transpose(),A)),A.transpose())
tmpH – used for sigma calculation (9 and 15 measurement schemes only)
Anisotropy tensor:
|Mx|   |s1 s4 s6|   |Bx|
|My| = |s4 s2 s5| . |By|
|Mz|   |s6 s5 s3|   |Bz|
A matrix (measurement matrix): each measurement yields three lines in the “A” matrix:
|Mi  |   |Bx 0  0  By 0  Bz|   |s1|
|Mi+1| = |0  By 0  Bx Bz 0 | . |s2|
|Mi+2|   |0  0  Bz 0  By Bx|   |s3|
                               |s4|
                               |s5|
                               |s6|
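The following numpy sketch (illustrative only, not the library's internal code) shows how the design-matrix formulation above recovers the six tensor elements s from the stacked measurement vector M; the unit fields along ±x, ±y, ±z mimic the 6-position scheme and the synthetic specimen is isotropic.
>>> import numpy as np
>>> fields = np.array([[1,0,0],[0,1,0],[0,0,1],[-1,0,0],[0,-1,0],[0,0,-1]], dtype=float)
>>> A = np.zeros((18, 6))
>>> for i, (bx, by, bz) in enumerate(fields):
...     A[3*i]   = [bx, 0,  0,  by, 0,  bz]
...     A[3*i+1] = [0,  by, 0,  bx, bz, 0 ]
...     A[3*i+2] = [0,  0,  bz, 0,  by, bx]
>>> B = np.dot(np.linalg.inv(np.dot(A.T, A)), A.T)     # B as defined above
>>> M = np.dot(A, np.array([1., 1., 1., 0., 0., 0.]))  # synthetic isotropic measurements
>>> s_fit = np.dot(B, M)                               # recovers [1, 1, 1, 0, 0, 0]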
- pmagpy.ipmag.histplot(infile='', data=(), outfile='', xlab='x', binsize=False, norm=1, fmt='svg', save_plots=True, interactive=False)[source]#
makes histograms for data
- Parameters:
infile (str) – default “” input file name format: single variable
data (tuple) – list-like, default () list/array of values to plot if infile is not provided
outfile (str) – default “” name for plot, if not provided defaults to hist.FMT
xlab (str) – default ‘x’ label for x axis
binsize (int) – default False desired binsize. If not specified, an appropriate binsize will be calculated.
norm (int) – default 1 1: norm, 0: don’t norm, -1: show normed and non-normed axes
fmt (str) – default “svg” format for figures, [“svg”, “jpg”, “pdf”, “png”]
save_plots (bool) – default True if True, create and save all requested plots
interactive (bool) – default False interactively plot and display (this is best used on the command line only)
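A minimal usage sketch (not part of the original docstring; the values here are synthetic placeholders): a histogram can be made directly from data held in memory rather than read from a file.
>>> import numpy as np
>>> values = np.random.normal(loc=0, scale=1, size=500)
>>> ipmag.histplot(data=values, xlab='synthetic value', outfile='synthetic_hist')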
- pmagpy.ipmag.hysteresis_magic(output_dir_path='.', input_dir_path='', spec_file='specimens.txt', meas_file='measurements.txt', fmt='svg', save_plots=True, make_plots=True, pltspec='', n_specs=5, interactive=False)[source]#
Calculate hysteresis parameters and plot hysteresis data. Plotting may be called interactively with save_plots==False, or be suppressed entirely with make_plots==False.
- Parameters:
output_dir_path – str, default “.” Note: if using Windows, all figures will be saved to the working directory, not dir_path
input_dir_path – str path for input file if different from output_dir_path (default is same)
spec_file – str, default “specimens.txt” output file to save hysteresis data
meas_file – str, default “measurements.txt” input measurement file
fmt – str, default “svg” format for figures, [svg, jpg, pdf, png]
save_plots – bool, default True if True, generate and save all requested plots
make_plots – bool, default True if False, skip making plots and just save hysteresis data (if False, save_plots will be set to False also)
pltspec – str, default “” specimen name to plot, otherwise will plot all specimens
n_specs – int number of specimens to plot, default 5 if you want to make all possible plots, specify “all”
interactive – bool, default False interactively plot and display for each specimen (this is best used on the command line or in the Python interpreter)
- Returns:
- Tuple
(True or False indicating if conversion was successful, output file names written)
- pmagpy.ipmag.hysteresis_magic2(path_to_file='.', hyst_file='rmag_hysteresis.txt', save=False, save_folder='.', fmt='svg', plots=True)[source]#
Calculates hysteresis parameters, saves them in rmag_hysteresis format file. If selected, this function also plots hysteresis loops, delta M curves, d (Delta M)/dB curves, and IRM backfield curves.
- Parameters:
path_to_file – path to directory that contains files (default is current directory, ‘.’)
hyst_file – hysteresis file (default is ‘rmag_hysteresis.txt’)
save – boolean argument to save plots (default is False)
save_folder – relative directory where plots will be saved (default is current directory, ‘.’)
fmt – format of saved figures (default is ‘svg’)
plots – whether or not to display the plots (default is True)
- pmagpy.ipmag.igrf(input_list, mod='', ghfile='')[source]#
Determine declination, inclination and intensity from a geomagnetic field model. The default model used is the IGRF model (http://www.ngdc.noaa.gov/IAGA/vmod/igrf.html) with other models available for selection with the available options detailed in the mod parameter below.
- Parameters:
input_list – list with format [Date, Altitude, Latitude, Longitude] date must be in decimal year format XXXX.XXXX (Common Era) altitude is in kilometers
mod –
desired model “” : Use the IGRF13 model by default ‘custom’ : use values supplied in ghfile or choose from this list [‘arch3k’,’cals3k’,’pfm9k’,’hfm10k’,’cals10k.2’,’cals10k.1b’,’shadif14’,’shawq2k’,’shawqIA’,’ggf100k’] where:
arch3k (Korte et al., 2009)
cals3k (Korte and Constable, 2011)
cals10k.1b (Korte et al., 2011)
pfm9k (Nilsson et al., 2014)
hfm10k is the hfm.OL1.A1 of Constable et al. (2016)
cals10k.2 (Constable et al., 2016)
shadif14 (Pavon-Carrasco et al., 2014)
shawq2k (Campuzano et al., 2019)
shawqIA (Osete et al., 2020)
ggf100k (Panovska et al., 2018) [only epochs from -99950 in 200 year increments are allowed]
The first four of these models are constrained to agree with gufm1 (Jackson et al., 2000) for the past four centuries.
ghfile – path to file with l m g h data
- Returns:
- igrf_array
array of magnetic field values (0: dec; 1: inc; 2: intensity (in nT))
Examples
>>> local_field = ipmag.igrf([2013.6544, .052, 37.87, -122.27])
>>> local_field
array([1.431355648576314e+01, 6.148304376287219e+01, 4.899264739340517e+04])
>>> ipmag.igrf_print(local_field)
Declination: 14.314
Inclination: 61.483
Intensity: 48992.647 nT
- pmagpy.ipmag.igrf_print(igrf_array)[source]#
Print out Declination, Inclination, Intensity from an array returned from the ipmag.igrf() function.
- Parameters:
igrf_array – array that is output from ipmag.igrf() function
Examples
An array generated by the ipmag.igrf() function is passed to ipmag.igrf_print():
>>> local_field = ipmag.igrf([2013.6544, .052, 37.87, -122.27])
>>> ipmag.igrf_print(local_field)
Declination: 14.314
Inclination: 61.483
Intensity: 48992.647 nT
- pmagpy.ipmag.inc_from_lat(lat)[source]#
Calculate inclination predicted from latitude using the dipole equation.
- Parameters:
lat – latitude in degrees
- Returns:
inclination calculated from latitude using the dipole equation
Examples
Calculate the inclination implied by a latitude of 45 degrees:
>>> ipmag.inc_from_lat(45)
63.434948822922
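For reference, the dipole equation used here is tan(I) = 2 tan(latitude); a short numpy sketch of the same relationship (illustrative only, not the library's implementation) reproduces the value above.
>>> import numpy as np
>>> np.degrees(np.arctan(2 * np.tan(np.radians(45))))   # ~63.4349, matching inc_from_lat(45)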
- pmagpy.ipmag.iplot_hys(fignum, B, M, s)[source]#
function to plot hysteresis data
This function has been adapted from pmagplotlib.iplot_hys for specific use within a Jupyter notebook.
- Parameters:
fignum – reference number for matplotlib figure being created
B – list of B (flux density) values of hysteresis experiment
M – list of M (magnetization) values of hysteresis experiment
s – specimen name
- pmagpy.ipmag.kent_distribution_95(dec=None, inc=None, di_block=None)[source]#
Calculates the Kent mean and provides the parameters associated with the region containing 95% of the directions from either a list of declination values and a separate list of inclination values or from a di_block (a nested list of [dec,inc,1.0]). Returns a dictionary with the Kent mean and statistical parameters.
- Parameters:
dec – list of declinations
inc – list of inclinations
di_block – a nested list of [dec,inc,1.0] A di_block can be provided instead of dec, inc lists in which case it will be used. Either dec, inc lists or a di_block need to be passed to the function.
- Returns:
dictionary containing Kent mean and associated statistics.
Examples
Use lists of declination and inclination to calculate a Kent mean:
>>> ipmag.kent_distribution_95(dec=[140,127,142,136],inc=[21,23,19,22])
{'dec': 136.30838974272072, 'inc': 21.347784026899987, 'n': 4, 'Zdec': 40.82469002841276, 'Zinc': 13.739412321974067, 'Edec': 280.38683553668795, 'Einc': 64.23659892174429, 'Zeta': 13.677129096579478, 'Eta': 1.4597607031196376}
Use a di_block to calculate a Kent mean (will give the same output as the example with the lists):
>>> ipmag.kent_distribution_95(di_block=[[140,21],[127,23],[142,19],[136,22]])
- pmagpy.ipmag.kent_mean(dec=None, inc=None, di_block=None)[source]#
Calculates the Kent mean and associated statistical parameters from either a list of declination values and a separate list of inclination values or from a di_block (a nested list of [dec,inc,1.0]). Returns a dictionary with the Kent mean and statistical parameters.
- Parameters:
dec – list of declinations
inc – list of inclinations or
di_block – a nested list of [dec,inc,1.0] A di_block can be provided instead of dec, inc lists in which case it will be used. Either dec, inc lists or a di_block need to be passed to the function.
- Returns:
dictionary containing Kent mean and associated statistics.
Examples
Use lists of declination and inclination to calculate a Kent mean:
>>> ipmag.kent_mean(dec=[140,127,142,136],inc=[21,23,19,22])
{'Edec': 280.38683553668795, 'Einc': 64.236598921744289, 'Eta': 0.72982112760919715, 'Zdec': 40.824690028412761, 'Zeta': 6.7896823241008795, 'Zinc': 13.739412321974067, 'dec': 136.30838974272072, 'inc': 21.347784026899987, 'n': 4}
Use a di_block to calculate a Kent mean (will give the same output as the example above with the dec, inc lists):
>>> ipmag.kent_mean(di_block=[[140,21],[127,23],[142,19],[136,22]])
- pmagpy.ipmag.kentrot(kent_dict, n=100, di_block=True)[source]#
Generates Kent distributed unit vectors from a specified distribution using the pmag.py function pmag.kentdev().
- Parameters:
kent_dict – a dictionary of Kent distribution parameters. It should at least have: dec: mean axis dec, inc: mean axis inc, Zdec: major axis dec, Zinc: major axis inc, Edec: minor axis dec, Einc: minor axis inc, R1: Kent distribution size quantity for calculating kappa and beta, R2: Kent distribution shape quantity for calculating kappa and beta
di_block – if True (default), this function returns a nested list of [dec,inc,1.0]; if False, it returns a list of dec and a list of inc
- Returns:
di_block, a nested list of [dec,inc,1.0] (default) dec, inc, a list of dec and a list of inc (if di_block = False)
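A minimal usage sketch (not part of the original docstring; the dictionary values below are placeholders adapted from the plot_pole_ellipse example later in this reference):
>>> kent_dict = {'dec': 287.538, 'inc': 88.561, 'Zdec': 54.831, 'Zinc': 0.872, 'Edec': 144.848, 'Einc': 1.145, 'R1': 0.9915, 'R2': 0.0063}
>>> simulated_directions = ipmag.kentrot(kent_dict, n=100)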
- pmagpy.ipmag.lat_from_inc(inc, a95=None)[source]#
Calculate paleolatitude from inclination using the dipole equation.
- Parameters:
inc – (paleo)magnetic inclination in degrees
a95 – 95% confidence interval from Fisher mean
- Returns:
if a95 is provided, paleo_lat, paleo_lat_max, and paleo_lat_min are returned; otherwise, only paleo_lat is returned
Examples
Calculate the paleolatitude implied by an inclination of 45 degrees:
>>> ipmag.lat_from_inc(45)
26.56505117707799
Calculate the paleolatitude and the maximum and minimum paleolatitude implied by an inclination of 20 degrees with an uncertainty on the mean (a95) of 5:
>>> ipmag.lat_from_inc(20, a95=5)
(10.314104815618196, 13.12426812279171, 7.630740212430057)
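For reference, this is the inverse dipole relation, latitude = arctan(tan(I)/2); the maximum and minimum values in the second example appear to follow from applying the same relation to inc + a95 and inc - a95. A short numpy sketch (illustrative only, not the library's implementation):
>>> import numpy as np
>>> np.degrees(np.arctan(np.tan(np.radians(45)) / 2.))   # ~26.5651, matching lat_from_inc(45)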
- pmagpy.ipmag.lat_from_pole(ref_loc_lon, ref_loc_lat, pole_plon, pole_plat)[source]#
Calculate paleolatitude for a reference location based on a paleomagnetic pole.
- Parameters:
ref_loc_lon – longitude of reference location in degrees E
ref_loc_lat – latitude of reference location in degrees N
pole_plon – paleopole longitude in degrees E
pole_plat – paleopole latitude in degrees N
- Returns:
paleolatitude for location based on pole
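One way to carry out this calculation is sketched below (an illustration under the assumption that the paleolatitude equals 90 degrees minus the great-circle distance between the reference location and the pole; not necessarily the library's exact implementation).
>>> import numpy as np
>>> def lat_from_pole_sketch(ref_loc_lon, ref_loc_lat, pole_plon, pole_plat):
...     # cosine of the angular distance (magnetic colatitude) between site and pole
...     cos_colat = (np.sin(np.radians(ref_loc_lat)) * np.sin(np.radians(pole_plat)) +
...                  np.cos(np.radians(ref_loc_lat)) * np.cos(np.radians(pole_plat)) *
...                  np.cos(np.radians(pole_plon - ref_loc_lon)))
...     return 90.0 - np.degrees(np.arccos(cos_colat))
>>> lat_from_pole_sketch(0, 45, 0, 90)   # a pole at the geographic pole returns the site latitude, ~45.0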
- pmagpy.ipmag.make_di_block(dec, inc, unit_vector=True)[source]#
Some pmag.py and ipmag.py functions require or will take a list of unit vectors [dec,inc,1.] as input. This function takes declination and inclination data and makes them into such a nested list of lists.
- Parameters:
dec – list of declinations
inc – list of inclinations
unit_vector – if True will return [dec,inc,1.]; if False will return [dec,inc]
- Returns:
- di_block
nested list of declination, inclination lists
Examples
>>> decs = [180.3, 179.2, 177.2]
>>> incs = [12.1, 13.7, 11.9]
>>> ipmag.make_di_block(decs,incs)
[[180.3, 12.1, 1.0], [179.2, 13.7, 1.0], [177.2, 11.9, 1.0]]
- pmagpy.ipmag.make_diddd_array(dec, inc, dip_direction, dip)[source]#
- Some pmag.py functions such as the bootstrap fold test require a numpy array
of dec, inc, dip direction, dip [dec, inc, dd, dip] as input. This function makes such an array.
- Parameters:
dec – paleomagnetic declination in degrees
inc – paleomagnetic inclination in degrees
dip_direction – the dip direction of bedding (in degrees between 0 and 360)
dip – dip of bedding (in degrees)
- Returns:
- array
an array of [dec, inc, dip_direction, dip]
Examples
Data in separate lists of dec, inc, dip_direction, dip data can be made into an array.
>>> dec = [132.5,124.3,142.7,130.3,163.2]
>>> inc = [12.1,23.2,34.2,37.7,32.6]
>>> dip_direction = [265.0,265.0,265.0,164.0,164.0]
>>> dip = [20.0,20.0,20.0,72.0,72.0]
>>> data_array = ipmag.make_diddd_array(dec,inc,dip_direction,dip)
>>> data_array
array([[ 132.5,  12.1,  265. ,  20. ],
       [ 124.3,  23.2,  265. ,  20. ],
       [ 142.7,  34.2,  265. ,  20. ],
       [ 130.3,  37.7,  164. ,  72. ],
       [ 163.2,  32.6,  164. ,  72. ]])
- pmagpy.ipmag.make_mollweide_map(central_longitude=0, figsize=(8, 8), add_land=True, land_color='tan', land_edge_color='black', add_ocean=False, ocean_color='lightblue', grid_lines=True, lat_grid=[-180.0, -150.0, -120.0, -90.0, -60.0, -30.0, 0.0, 30.0, 60.0, 90.0, 120.0, 150.0, 180.0], lon_grid=[-180.0, -150.0, -120.0, -90.0, -60.0, -30.0, 0.0, 30.0, 60.0, 90.0, 120.0, 150.0, 180.0])[source]#
Function creates and returns a Mollweide map projection using cartopy
- Parameters:
central_longitude – central longitude of projection (default is 0)
figsize – size of the figure (default is 8x8)
add_land – choose whether land is plotted on the map (default is True)
land_color – specify land color (default is ‘tan’)
add_ocean – choose whether ocean is plotted on the map (default is False, change to True to plot)
ocean_color – specify ocean color (default is ‘lightblue’)
grid_lines – choose whether grid lines are plotted on the map (default is True)
lat_grid – specify the latitude grid (default is 30 degree spacing)
lon_grid – specify the longitude grid (default is 30 degree spacing)
Examples
>>> map_axis = make_mollweide_map(central_longitude=200)
- pmagpy.ipmag.make_orthographic_map(central_longitude=0, central_latitude=0, figsize=(8, 8), add_land=True, land_color='tan', land_edge_color='black', add_ocean=False, ocean_color='lightblue', grid_lines=True, lat_grid=[-80.0, -60.0, -30.0, 0.0, 30.0, 60.0, 80.0], lon_grid=[-180.0, -150.0, -120.0, -90.0, -60.0, -30.0, 0.0, 30.0, 60.0, 90.0, 120.0, 150.0, 180.0])[source]#
Function creates and returns an orthographic map projection using cartopy
- Parameters:
central_longitude – central longitude of projection (default is 0)
central_latitude – central latitude of projection (default is 0)
figsize – size of the figure (default is 8x8)
add_land – choose whether land is plotted on the map (default is True)
land_color – specify land color (default is ‘tan’)
add_ocean – choose whether ocean is plotted on the map (default is False, change to True to plot)
ocean_color – specify ocean color (default is ‘lightblue’)
grid_lines – choose whether grid lines are plotted on the map (default is True)
lat_grid – specify the latitude grid (default is 30 degree spacing)
lon_grid – specify the longitude grid (default is 30 degree spacing)
Examples
>>> map_axis = make_orthographic_map(central_longitude=200,central_latitude=30)
- pmagpy.ipmag.make_robinson_map(central_longitude=0, figsize=(8, 8), add_land=True, land_color='tan', add_ocean=False, ocean_color='lightblue', grid_lines=True, lat_grid=[-180.0, -150.0, -120.0, -90.0, -60.0, -30.0, 0.0, 30.0, 60.0, 90.0, 120.0, 150.0, 180.0], lon_grid=[-180.0, -150.0, -120.0, -90.0, -60.0, -30.0, 0.0, 30.0, 60.0, 90.0, 120.0, 150.0, 180.0])[source]#
Function creates and returns a Robinson map projection using cartopy
- Parameters:
central_longitude – central longitude of projection (default is 0)
figsize – size of the figure (default is 8x8)
add_land – choose whether land is plotted on the map (default is True)
land_color – specify land color (default is ‘tan’)
add_ocean – choose whether ocean is plotted on the map (default is False, change to True to plot)
ocean_color – specify ocean color (default is ‘lightblue’)
grid_lines – choose whether grid lines are plotted on the map (default is True)
lat_grid – specify the latitude grid (default is 30 degree spacing)
lon_grid – specify the longitude grid (default is 30 degree spacing)
Examples
>>> map_axis = make_robinson_map(central_longitude=200)
- pmagpy.ipmag.mean_bootstrap_confidence(dec=None, inc=None, di_block=None, num_sims=10000, alpha=0.05)[source]#
Estimates the bootstrap confidence region for the mean of a collection of paleomagnetic directions based on the approach of Heslop et al. (2023). This approach involves the projection onto a tangent plane for a tractable statistical analysis in two dimensions.
- Parameters:
dec (list or None) – List of declination values. Default is None.
inc (list or None) – List of inclination values. Default is None.
di_block (list or None) – List of [dec, inc] pairs. Default is None. A di_block can be provided instead of dec, inc lists in which case it will be used. Either dec, inc lists or a di_block needs to be provided.
num_sims (int) – Number of bootstrap iterations. Default is 10,000.
alpha (float) – Confidence region. Default is 0.05 corresponding to 95% region.
- Returns:
- A tuple containing:
(1) a dictionary of parameters that includes the estimated mean direction and the T statistic which is the basis of the bootstrap confidence region, (2) a list of [dec, inc] pairs that represent the boundary of the confidence region. The bootstrap confidence region cannot be reported readily in a compact form, so it is instead given as a long list of points along the boundary of the confidence region.
- Return type:
tuple
References
Heslop, D., Scealy, J. L., Wood, A. T. A., Tauxe, L., & Roberts, A. P. (2023). A bootstrap common mean direction test. Journal of Geophysical Research: Solid Earth, 128, e2023JB026983. https://doi.org/10.1029/2023JB026983
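A minimal usage sketch (not part of the original docstring; the directions are synthetic and the outputs vary with the random draws): the function is documented to return a parameter dictionary and the list of boundary [dec, inc] points, which can then be passed to ipmag.plot_bootstrap_confidence().
>>> directions = ipmag.fishrot(k=25, n=40, dec=200, inc=45)
>>> decs, incs, _ = ipmag.unpack_di_block(directions)
>>> mean_params, confidence_DI = ipmag.mean_bootstrap_confidence(dec=decs, inc=incs, num_sims=5000)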
- pmagpy.ipmag.orientation_magic(or_con=1, dec_correction_con=1, dec_correction=0, bed_correction=True, samp_con='1', hours_from_gmt=0, method_codes='', average_bedding=False, orient_file='orient.txt', samp_file='samples.txt', site_file='sites.txt', output_dir_path='.', input_dir_path='', append=False, data_model=3)[source]#
use this function to convert tab delimited field notebook information to MagIC formatted tables (er_samples and er_sites)
- INPUT FORMAT
Input files must be tab delimited and have in the first line:
- tab location_name
- Note: The “location_name” will facilitate searching in the MagIC database. Data from different “locations” should be put in separate files. The definition of a “location” is rather loose. Also, this is the word ‘tab’, not an actual tab, which will be indicated by ‘ ‘.
The second line has the names of the columns (tab delimited), e.g.: site_name sample_name mag_azimuth field_dip date lat long sample_lithology sample_type sample_class shadow_angle hhmm stratigraphic_height bedding_dip_direction bedding_dip GPS_baseline image_name image_look image_photographer participants method_codes site_description sample_description GPS_Az, sample_igsn, sample_texture, sample_cooling_rate, cooling_rate_corr, cooling_rate_mcd
defaults: orientation_magic(or_con=1, dec_correction_con=1, dec_correction=0, bed_correction=True, samp_con=’1’, hours_from_gmt=0, method_codes=’’, average_bedding=False, orient_file=’orient.txt’, samp_file=’er_samples.txt’, site_file=’er_sites.txt’, output_dir_path=’.’, input_dir_path=’’, append=False): orientation conventions:
- [1] Standard Pomeroy convention of azimuth and hade (degrees from vertical down)
of the drill direction (field arrow). lab arrow azimuth= sample_azimuth = mag_azimuth; lab arrow dip = sample_dip =-field_dip. i.e. the lab arrow dip is minus the hade.
- [2] Field arrow is the strike of the plane orthogonal to the drill direction,
Field dip is the hade of the drill direction. Lab arrow azimuth = mag_azimuth-90 Lab arrow dip = -field_dip
- [3] Lab arrow is the same as the drill direction;
hade was measured in the field. Lab arrow azimuth = mag_azimuth; Lab arrow dip = 90-field_dip
[4] lab azimuth and dip are same as mag_azimuth, field_dip: use this for unoriented samples too
[5] Same as AZDIP convention explained below - azimuth and inclination of the drill direction are mag_azimuth and field_dip; lab arrow is as in [1] above. lab azimuth is same as mag_azimuth, lab arrow dip = field_dip-90
[6] Lab arrow azimuth = mag_azimuth-90; Lab arrow dip = 90-field_dip
[7] see http://earthref.org/PmagPy/cookbook/#field_info for more information. You can customize other formats yourself, or email ltauxe@ucsd.edu for help.
[8] Lab arrow azimuth = mag_azimuth-180; Lab arrow dip = 90-field_dip
- Magnetic declination convention:
[1] Use the IGRF value at the lat/long and date supplied [default]
[2] Will supply declination correction
[3] mag_az is already corrected in file
[4] Correct mag_az but not bedding_dip_dir
- Sample naming convention:
- [1] XXXXY: where XXXX is an arbitrary length site designation and Y is the single character sample designation. e.g., TG001a is the first sample from site TG001. [default]
[2] XXXX-YY: YY sample from site XXXX (XXX, YY of arbitrary length)
[3] XXXX.YY: YY sample from site XXXX (XXX, YY of arbitrary length)
[4-Z] XXXX[YYY]: YYY is sample designation with Z characters from site XXX
[5] site name = sample name
[6] site name entered in site_name column in the orient.txt format input file – NOT CURRENTLY SUPPORTED
[7-Z] [XXX]YYY: XXX is site designation with Z characters from samples XXXYYY
NB: all others you will have to either customize yourself or e-mail ltauxe@ucsd.edu for help.
Note
column order doesn’t matter but the NAMES do.
sample_name, sample_lithology, sample_type, sample_class, lat and long are required. all others are optional.
- If subsequent data are the same (e.g., date, bedding orientation, participants, stratigraphic_height),
you can leave the field blank and the program will fill in the last recorded information. BUT if you really want a blank stratigraphic_height, enter a ‘-1’. These will not be inherited and must be specified for each entry: image_name, look, photographer or method_codes
hhmm must be in the format: hh:mm and the hh must be in 24 hour time.
- date must be mm/dd/yy (years < 50 will be converted to 20yy and >50 will be assumed 19yy). hours_from_gmt is the number of hours to SUBTRACT from hh to get to GMT.
image_name, image_look and image_photographer are colon delimited lists of file name (e.g., IMG_001.jpg) image look direction and the name of the photographer respectively. If all images had same look and photographer, just enter info once. The images will be assigned to the site for which they were taken - not at the sample level.
participants: Names of who helped take the samples. These must be a colon delimited list.
- method_codes: Special method codes on a sample level, e.g., SO-GT5, which means the orientation has an uncertainty of >5 degrees, for example if it broke off before orienting….
GPS_Az is the place to put directly determined GPS Azimuths, using, e.g., points along the drill direction.
sample_cooling_rate is the cooling rate in K per Ma
int_corr_cooling_rate is the percent cooling rate factor to apply to specimens from this sample (DA-CR-XX is the method code)
cooling_rate_mcd: data adjustment method code for cooling rate correction; DA-CR-EG is educated guess; DA-CR-PS is percent estimated from pilot samples; DA-CR-TRM is comparison between 2 TRMs acquired with slow and rapid cooling rates
- pmagpy.ipmag.plate_rate_mc(pole1_plon, pole1_plat, pole1_kappa, pole1_N, pole1_age, pole1_age_error, pole2_plon, pole2_plat, pole2_kappa, pole2_N, pole2_age, pole2_age_error, ref_loc_lon, ref_loc_lat, samplesize=10000, random_seed=None, plot=True, savefig=True, save_directory='./', figure_name='')[source]#
Determine the latitudinal motion implied by a pair of poles and utilize the Monte Carlo sampling method of Swanson-Hysell (2014) to determine the associated uncertainty.
- Parameters:
plon – longitude of pole (supplied for both poles, i.e., pole1_plon and pole2_plon)
plat – latitude of pole
kappa – Fisher precision parameter for VGPs in pole
N – number of VGPs in pole
age – age assigned to pole in Ma
age_error – 1 sigma age uncertainty in million years
ref_loc_lon – longitude of reference location
ref_loc_lat – latitude of reference location
samplesize – number of draws from pole and age distributions (default is 10000)
random_seed – set random seed for reproducible number generation (default is None)
plot – whether to make figures (default is True, optional)
savefig – whether to save figures (default is True, optional)
save_directory – default is local directory (optional)
figure_name – prefix for file names (optional)
- Returns:
rate of latitudinal motion in cm/yr along with estimated 2.5 and 97.5
percentile rate estimates
- pmagpy.ipmag.plot_bootstrap_confidence(mean_dec, mean_inc, confidence_DI, mean_color='k', confidence_color='k', mean_marker='o', confidence_marker='.', mean_markersize=20, confidence_markersize=1)[source]#
Plot mean and bootstrap confidence outline on an equal area plot. The input confidence_DI is the output from the mean_bootstrap_confidence() function.
Before this function is called a plot needs to be initialized with code that looks something like:
>fignum = 1
>plt.figure(num=fignum,figsize=(10,10),dpi=160)
>ipmag.plot_net(fignum)
- Parameters:
mean_dec – Declination of the mean point.
mean_inc – Inclination of the mean point.
confidence_DI – A nested list of [dec, inc, 1.0] representing the bootstrap confidence.
mean_color – Color of the mean point. Default is black.
confidence_color – Color of the confidence points. Default is black.
mean_marker – Marker style for the mean point. Default is ‘o’ (circle).
confidence_marker – Marker style for the confidence points. Default is ‘.’ (point).
mean_markersize – Marker size for the mean point. Default is 20.
confidence_markersize – Marker size for the confidence points. Default is 1.
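A minimal usage sketch (assuming confidence_DI was obtained from ipmag.mean_bootstrap_confidence() as described above, that matplotlib.pyplot is imported as plt, and that the mean values here are placeholders):
>>> fignum = 1
>>> plt.figure(num=fignum, figsize=(10,10), dpi=160)
>>> ipmag.plot_net(fignum)
>>> ipmag.plot_bootstrap_confidence(200, 45, confidence_DI, mean_color='red')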
- pmagpy.ipmag.plot_di(dec=None, inc=None, di_block=None, color='k', marker='o', markersize=20, legend='no', label='', connect_points=False, lw=0.25, lc='k', la=0.5, title=None, edge=None, alpha=1, zorder=2)[source]#
Plot declination, inclination data on an equal area plot.
Before this function is called a plot needs to be initialized with code that looks something like:
>fignum = 1
>plt.figure(num=fignum,figsize=(10,10),dpi=160)
>ipmag.plot_net(fignum)
- Parameters:
dec – declination being plotted
inc – inclination being plotted
di_block – a nested list of [dec,inc,1.0] (di_block can be provided instead of dec, inc in which case it will be used)
color – the default color is black. Other colors can be chosen (e.g. ‘r’)
marker – the default marker is a circle (‘o’)
markersize – default size is 20
legend – the default is no legend (‘no’). Putting ‘yes’ will plot a legend.
label – the default label is blank (‘’)
connect_points – option to connect points in order of plotting, default is False
lw – linewidth of connecting lines
lc – color of connecting lines
la – alpha of connecting lines
title – plot title (default is None, no title)
edge – marker edge color - if blank, is color of marker
alpha – opacity
zorder – zorder of marker
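A minimal usage sketch (not part of the original docstring; the directions are synthetic and matplotlib.pyplot is assumed to be imported as plt):
>>> directions = ipmag.fishrot(k=30, n=25, dec=150, inc=45)
>>> fignum = 1
>>> plt.figure(num=fignum, figsize=(10,10), dpi=160)
>>> ipmag.plot_net(fignum)
>>> ipmag.plot_di(di_block=directions, color='red', label='simulated directions', legend='yes')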
- pmagpy.ipmag.plot_di_mean(dec, inc, a95, color='k', marker='o', markersize=20, label='', legend='no', zorder=2)[source]#
Plot a mean direction (declination, inclination) with alpha_95 ellipse on an equal area plot.
Before this function is called, a plot needs to be initialized with code that looks something like:
>fignum = 1
>plt.figure(num=fignum,figsize=(10,10),dpi=160)
>ipmag.plot_net(fignum)
- Parameters:
dec – declination of mean being plotted
inc – inclination of mean being plotted
a95 – a95 confidence ellipse of mean being plotted
color – the default color is black. Other colors can be chosen (e.g. ‘r’).
marker – the default is a circle. Other symbols can be chosen (e.g. ‘s’).
markersize – the default is 20. Other sizes can be chosen.
label – the default is no label. Labels can be assigned.
legend – the default is no legend (‘no’). Putting ‘yes’ will plot a legend.
zorder – zorder of marker
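A minimal usage sketch (the mean direction and a95 below are placeholder values; matplotlib.pyplot is assumed to be imported as plt):
>>> fignum = 1
>>> plt.figure(num=fignum, figsize=(10,10), dpi=160)
>>> ipmag.plot_net(fignum)
>>> ipmag.plot_di_mean(dec=200, inc=45, a95=6.5, color='blue', label='mean direction')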
- pmagpy.ipmag.plot_di_mean_bingham(bingham_dictionary, fignum=1, color='k', marker='o', markersize=20, label='', legend='no')[source]#
see plot_di_mean_ellipse
- pmagpy.ipmag.plot_di_mean_ellipse(dictionary, fignum=1, color='k', marker='o', markersize=20, label='', legend='no')[source]#
Plot a mean direction (declination, inclination) confidence ellipse.
- Parameters:
dictionary – a dictionary generated by the pmag.dobingham or pmag.dokent functions
- pmagpy.ipmag.plot_distributions(ax, lon_samples, lat_samples, to_plot='d', resolution=100, **kwargs)[source]#
plot distributions of a group of vectors on a unit sphere
- Parameters:
ax – matplotlib axis
lon_samples – a list or array of longitude samples
lat_samples – a list or array of latitude samples
to_plot – the type of distribution plot to show, can be ‘d’ as colormesh, ‘e’ as contour, ‘s’ as discrete scatter plots
resolution – the resolution at which to plot the distributions
kwargs – other keyword arguments inherited from matplotlib
- pmagpy.ipmag.plot_dmag(data='', title='', fignum=1, norm=1, dmag_key='treat_ac_field', intensity='', quality=False)[source]#
plots demagnetization data versus treatment step for all specimens in a pandas DataFrame (datablock)
- Parameters:
data – Pandas dataframe with MagIC data model 3 columns:
fignum – figure number
specimen – specimen name
dmag_key – one of these: [‘treat_temp’,’treat_ac_field’,’treat_mw_energy’] selected using method_codes: [‘LT-T-Z’,’LT-AF-Z’,’LT-M-Z’] respectively
intensity – if blank will choose one of these: [‘magn_moment’, ‘magn_volume’, ‘magn_mass’]
quality – if True use the quality column of the DataFrame
title – title for plot
norm – if True, normalize data to first step
- Returns:
matplotlib plot
- pmagpy.ipmag.plot_gc(poles, color='g', fignum=1)[source]#
plots a great circle on an equal area projection
- Parameters:
fignum – number of matplotlib object
poles – nested list of [Dec,Inc] pairs of poles
color – color of lower hemisphere dots for great circle - must be in form: ‘g’,’r’,’y’,’k’,etc. upper hemisphere is always cyan
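A minimal usage sketch (the poles to the great circles below are placeholder values; matplotlib.pyplot is assumed to be imported as plt):
>>> plt.figure(num=1, figsize=(10,10), dpi=160)
>>> ipmag.plot_net(1)
>>> ipmag.plot_gc([[350, 10], [290, 40]], color='g', fignum=1)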
- pmagpy.ipmag.plot_net(fignum=None, tick_spacing=10, ax=None)[source]#
Draws circle and tick marks for equal area projection.
- Parameters:
fignum – int or None Figure number to use for creating a new figure if no axis is provided.
tick_spacing – int Interval for declination tick marks, default is 10.
ax – matplotlib.axes.Axes or None Axis to plot on. If None, the current axis will be used (or created if fignum is given).
- pmagpy.ipmag.plot_pole(map_axis, plon, plat, A95, label='', color='k', edgecolor='k', marker='o', markersize=20, legend='no', outline=True, filled_pole=False, fill_color='k', fill_alpha=1.0, mean_alpha=1.0, A95_alpha=1.0, zorder=100)[source]#
This function plots a paleomagnetic pole and A95 error ellipse on a cartopy map axis.
Before this function is called, a plot needs to be initialized with code such as that in the make_orthographic_map function.
- Parameters:
map_axis – the name of the current map axis that has been developed using cartopy
plon – the longitude of the paleomagnetic pole being plotted (in degrees E)
plat – the latitude of the paleomagnetic pole being plotted (in degrees)
A95 – the A_95 confidence ellipse of the paleomagnetic pole (in degrees)
color – symbol color; the default color is black. Other colors can be chosen (e.g. ‘r’)
marker – the default marker is a circle. Other symbols can be chosen (e.g. ‘s’)
markersize – the default is 20. Other size can be chosen
label – the default is no label. Labels can be assigned.
legend – the default is no legend (‘no’). Putting ‘yes’ will plot a legend.
filled_pole – if True, the A95 ellipse will be filled with color
fill_color – color of fill; the default is black.
fill_alpha – transparency of filled ellipse (the default is 1.0; no transparency).
mean_alpha – transparency of pole mean (the default is 1.0; no transparency).
zorder – plotting order (default is 100; higher will move to top of plot)
Examples
>>> plon = 200
>>> plat = 60
>>> A95 = 6
>>> map_axis = ipmag.make_orthographic_map(central_longitude=200,central_latitude=30)
>>> ipmag.plot_pole(map_axis, plon, plat, A95, color='red', markersize=40, zorder=20)
- pmagpy.ipmag.plot_pole_dp_dm(map_axis, plon, plat, slon, slat, dp, dm, pole_label='pole', site_label='site', pole_color='k', pole_edgecolor='k', pole_marker='o', site_color='r', site_edgecolor='r', site_marker='s', markersize=20, legend=True, transform=ccrs.PlateCarree())[source]#
This function plots a paleomagnetic pole and a dp/dm confidence ellipse on a cartopy map axis.
Before this function is called, a plot needs to be initialized with code such as that in the make_orthographic_map function.
- Parameters:
map_axis – the name of the current map axis that has been developed using cartopy
plon – the longitude of the paleomagnetic pole being plotted (in degrees E)
plat – the latitude of the paleomagnetic pole being plotted (in degrees)
slon – the longitude of the site (in degrees E)
slat – the latitude of the site (in degrees)
dp – the semi-minor axis of the confidence ellipse (in degrees)
dm – the semi-major axis of the confidence ellipse (in degrees)
pole_color – the default color is black. Other colors can be chosen (e.g. ‘g’)
site_color – the default color is red. Other colors can be chosen (e.g. ‘g’)
pole_marker – the default is a circle. Other symbols can be chosen (e.g. ‘s’)
site_marker – the default is a square. Other symbols can be chosen (e.g. ‘^’)
markersize – the default is 20. Other size can be chosen
pole_label – string that labels the pole.
site_label – string that labels the site
legend – the default is a legend (True). Putting False will suppress legend plotting.
transform – the default is the PlateCarree transform in Cartopy. Other transforms can be chosen (e.g. ccrs.geodetic), but this parameter rarely needs to be changed by the user and is included for completeness and in case of artifacts arising from the PlateCarree transform on some map projections in which case the Geodetic transform may work better.
Examples
>>> dec = 280
>>> inc = 45
>>> a95 = 5
>>> site_lat = 45
>>> site_lon = -100
>>> pole = pmag.dia_vgp(dec, inc, a95, site_lat, site_lon)
>>> pole_lon = pole[0]
>>> pole_lat = pole[1]
>>> dp = pole[2]
>>> dm = pole[3]
>>> map_axis = ipmag.make_orthographic_map(central_longitude=200,central_latitude=30)
>>> ipmag.plot_pole_dp_dm(map_axis,pole_lon,pole_lat,site_lon,site_lat,dp,dm)
- pmagpy.ipmag.plot_pole_ellipse(map_axis, dictionary, color='k', edgecolor='k', marker='s', markersize=20, label='', alpha=1.0, lw=1, lower=True, zorder=100)[source]#
Plot a mean pole confidence ellipse associated with a Kent distribution
- Parameters:
map_axis – the name of the current map axis that has been developed using cartopy
dictionary – a dictionary generated by the pmag.dobingham or pmag.dokent functions
color – symbol color; the default color is black. Other colors can be chosen (e.g. ‘r’)
marker – the default marker is a circle. Other symbols can be chosen (e.g. ‘s’)
markersize – the default is 20. Other size can be chosen
label – the default is no label. Labels can be assigned.
legend – the default is no legend (‘no’). Putting ‘yes’ will plot a legend.
filled_pole – if True, the A95 ellipse will be filled with color
fill_color – color of fill; the default is black.
fill_alpha – transparency of filled ellipse (the default is 1.0; no transparency).
lower – hemisphere to plot the ellipse when calling function pmagplotlib.plot_ell (default is True)
zorder – plotting order (default is 100; higher will move to top of plot)
Examples
>>> kent_dict = {'dec': 287.53798364307437, 'inc': 88.56067392991959, 'n': 5, 'Zdec': 54.83073632264832, 'Zinc': 0.8721861867684042, 'Edec': 144.84816793561657, 'Einc': 1.1448791390804505, 'Zeta': 4.640345964184263, 'Eta': 6.8378968512569465, 'R1': 0.9914595207919079, 'R2': 0.006259515780690272}
>>> map_axis = ipmag.make_orthographic_map(central_longitude=200,central_latitude=90)
>>> ipmag.plot_pole_ellipse(map_axis, kent_dict, color='red', markersize=40)
- pmagpy.ipmag.plot_poles(map_axis, plon, plat, A95, label='', color='k', edgecolor='k', marker='o', markersize=20, legend='no', outline=True, filled_pole=False, fill_color='k', fill_alpha=1.0, alpha=1.0, zorder=101, lw=1)[source]#
This function plots paleomagnetic poles and A95 error ellipses on a cartopy map axis.
Before this function is called, a plot needs to be initialized with code such as that in the make_orthographic_map function.
- Parameters:
map_axis – the name of the current map axis that has been developed using cartopy
plon – the longitude of the paleomagnetic pole being plotted (in degrees E)
plat – the latitude of the paleomagnetic pole being plotted (in degrees)
A95 – the A_95 confidence ellipse of the paleomagnetic pole (in degrees)
color – the default color is black. Other colors can be chosen (e.g. ‘r’) a list of colors can also be given so that each pole has a distinct color
edgecolor – the default edgecolor is black. Other colors can be chosen (e.g. ‘r’)
marker – the default is a circle. Other symbols can be chosen (e.g. ‘s’)
markersize – the default is 20. Other size can be chosen
label – the default is no label. Labels can be assigned.
legend – the default is no legend (‘no’). Putting ‘yes’ will plot a legend.
filled_pole – if True, the A95 ellipse will be filled with color
fill_color – color of fill; the default is black.
fill_alpha – transparency of filled ellipse (the default is 1.0; no transparency).
alpha – transparency of pole mean (the default is 1.0; no transparency).
zorder – plotting order (default is 100; higher will move to top of plot)
Examples
>>> plons = [200, 180, 210]
>>> plats = [60, 40, 35]
>>> A95s = [6, 3, 10]
>>> map_axis = ipmag.make_orthographic_map(central_longitude=200, central_latitude=30)
>>> ipmag.plot_poles(map_axis, plons, plats, A95s, color='red', markersize=40)
>>> plons = [200, 180, 210]
>>> plats = [60, 40, 35]
>>> A95s = [6, 3, 10]
>>> colors = ['red','green','blue']
>>> map_axis = ipmag.make_orthographic_map(central_longitude=200, central_latitude=30)
>>> ipmag.plot_poles(map_axis, plons, plats, A95s, color=colors, markersize=40)
- pmagpy.ipmag.plot_poles_colorbar(map_axis, plons, plats, A95s, colorvalues, vmin, vmax, colormap='viridis', edgecolor='k', marker='o', markersize=20, alpha=1.0, colorbar=True, colorbar_label='pole age (Ma)', outline='True', filled_pole=False, fill_alpha=1.0, lw=1)[source]#
This function plots multiple paleomagnetic pole and A95 error ellipse on a cartopy map axis. The poles are colored by the defined colormap.
Before this function is called, a plot needs to be initialized with code such as that in the make_orthographic_map function.
- Parameters:
map_axis – the name of the current map axis that has been developed using cartopy
plons – the longitude of the paleomagnetic pole being plotted (in degrees E)
plats – the latitude of the paleomagnetic pole being plotted (in degrees)
A95s – the A_95 confidence ellipse of the paleomagnetic pole (in degrees)
colorvalues – what attribute is being used to determine the colors
vmin – what is the minimum range for the colormap
vmax – what is the maximum range for the colormap
colormap – the colormap used (default is ‘viridis’; others should be put as a string with quotes, e.g. ‘plasma’)
edgecolor – the color desired for the symbol outline
marker – the marker shape desired for the pole mean symbol (default is ‘o’ aka a circle)
colorbar – the default is to include a colorbar (True). Putting False will suppress the colorbar.
colorbar_label – label for the colorbar
Examples
>>> plons = [200, 180, 210]
>>> plats = [60, 40, 35]
>>> A95s = [6, 3, 10]
>>> ages = [100,200,300]
>>> vmin = 0
>>> vmax = 300
>>> map_axis = ipmag.make_orthographic_map(central_longitude=200, central_latitude=30)
>>> ipmag.plot_poles_colorbar(map_axis, plons, plats, A95s, ages, vmin, vmax)
- pmagpy.ipmag.plot_vgp(map_axis, vgp_lon=None, vgp_lat=None, di_block=None, label='', color='k', marker='o', edge='black', markersize=20, alpha=1, legend=False, zorder=100)[source]#
This function plots a paleomagnetic pole position onto a cartopy map axis.
Before this function is called, a map plot needs to be initialized with code such as that in the ipmag.make_orthographic_map() function (see example below).
- Parameters:
map_axis – the name of the current map axis that has been developed using cartopy
vgp_lon – the longitude of the pole (VGP) being plotted (in degrees E)
vgp_lat – the latitude of the pole (VGP) being plotted (in degrees)
color – the color desired for the symbol (default is ‘k’ aka black)
marker – the marker shape desired for the pole mean symbol (default is ‘o’ aka a circle)
edge – the color of the edge of the marker (default is black); can be set to None to have no edge
markersize – size of the marker in pt (default is 20)
alpha – the transparency of the points (default is 1 which is opaque, 0 is fully transparent)
label – the default is no label. Labels can be assigned.
legend – the default is no legend (False). Putting True will plot a legend.
zorder – plotting order (default is 100; higher will move to top of plot)
Examples
>>> vgps = ipmag.fishrot(dec=200,inc=30)
>>> vgp_lon_list,vgp_lat_list,intensities = ipmag.unpack_di_block(vgps)
>>> map_axis = ipmag.make_orthographic_map(central_longitude=200,central_latitude=30)
>>> ipmag.plot_vgp(map_axis,vgp_lon=vgp_lon_list,vgp_lat=vgp_lat_list,color='red',markersize=40,zorder=20)
- pmagpy.ipmag.pmag_results_extract(res_file='pmag_results.txt', crit_file='', spec_file='', age_file='', latex=False, grade=False, WD='.')[source]#
Generate tab delimited output file(s) with result data. Save output files and return True if successful. Possible output files: Directions, Intensities, SiteNfo, Criteria,
Specimens
- Parameters:
res_file – name of pmag_results file (default is “pmag_results.txt”)
crit_file – name of criteria file (default is “pmag_criteria.txt”)
spec_file – name of specimen file (default is “pmag_specimens.txt”)
age_file – name of age file (default is “er_ages.txt”)
latex – boolean argument to output in LaTeX (default is False)
WD – path to directory that contains input files and takes output (default is current directory, ‘.’)
- pmagpy.ipmag.pole_comparison_H2019(lon_1, lat_1, k_1, r_1, lon_2, lat_2, k_2, r_2)[source]#
Calculate the Bhattacharyya Coefficient, Bayes error and the Kullback-Leibler divergence associated with the comparison of paleomagnetic poles following Heslop and Roberts (2019). The divergence parameter is asymmetric such that the pole that is the reference pole should be (lon_1, lat_1, k_1, r_1) and the pole of interest being compared to that reference pole should be (lon_2, lat_2, k_2, r_2).
- Parameters:
lon_1 – longitude of pole 1 (reference pole)
lat_1 – latitude of pole 1
k_1 – Fisher concentration parameter of pole 1
r_1 – resultant vector length of pole 1
lon_2 – longitude of pole 2 (pole of interest)
lat_2 – latitude of pole 2
k_2 – Fisher concentration parameter of pole 2
r_2 – resultant vector length of pole 2
- Returns:
bhattacharyya, Bhattacharyya coefficient
bayes, bayes error
kld, Kullback-Leibler divergence
Notes
This function utilizes code developed by D. Heslop (dave-heslop74/kld and dave-heslop74/bhattacharyya).
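A minimal usage sketch (the pole parameters below are placeholders, not real data; the function is documented above to return the three statistics in this order):
>>> bhattacharyya, bayes, kld = ipmag.pole_comparison_H2019(
...     lon_1=30., lat_1=70., k_1=50., r_1=9.8,
...     lon_2=35., lat_2=68., k_2=40., r_2=7.9)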
- pmagpy.ipmag.polemap_magic(loc_file='locations.txt', dir_path='.', interactive=False, crd='', sym='ro', symsize=40, rsym='g^', rsymsize=40, fmt='pdf', res='c', proj='ortho', flip=False, anti=False, fancy=False, ell=False, ages=False, lat_0=90.0, lon_0=0.0, save_plots=True, contribution=None, image_records=False)[source]#
Use a MagIC format locations table to plot poles.
- Parameters:
loc_file – str, default “locations.txt”
dir_path – str, default “.” directory name to find loc_file in (if not included in loc_file)
interactive – bool, default False if True, interactively plot and display (this is best used on the command line only)
crd – str, default “” coordinate system [g, t] (geographic, tilt_corrected)
sym – str, default “ro” symbol color and shape, default red circles (see matplotlib documentation for more options)
symsize – int, default 40 symbol size
rsym – str, default “g^” symbol for plotting reverse poles
rsymsize – int, default 40 symbol size for reverse poles
fmt – str, default “pdf” format for figures, [“svg”, “jpg”, “pdf”, “png”]
res – str, default “c” resolution [c, l, i, h] (crude, low, intermediate, high)
proj – str, default “ortho” ortho = orthographic lcc = lambert conformal moll = mollweide merc = mercator
flip – bool, default False if True, flip reverse poles to normal antipode
anti – bool, default False if True, plot antipodes for each pole
fancy – bool, default False if True, plot topography (not yet implemented)
ell – bool, default False if True, plot ellipses
ages – bool, default False if True, plot ages
lat_0 – float, default 90. eyeball latitude
lon_0 – float, default 0. eyeball longitude
save_plots – bool, default True if True, create and save all requested plots
image_records – generate and return a record for each image in a list of dicts which can be ingested by pmag.magic_write bool, default False
- Returns:
- if image_records == False
True or False indicating if conversion was successful, file name(s) written
- if image_records == True
True or False indicating if conversion was successful, output file name written, list of image recs
- pmagpy.ipmag.print_direction_mean(mean_dictionary)[source]#
Does a pretty job printing a Fisher mean and associated statistics for directional data.
- Parameters:
mean_dictionary – output dictionary of pmag.fisher_mean()
- Returns:
prints the mean and associated statistics
Examples
Generate a Fisher mean using
ipmag.fisher_mean()
and then print it nicely usingipmag.print_direction_mean()
>>> my_mean = ipmag.fisher_mean(di_block=[[140,21],[127,23],[142,19],[136,22]])
>>> ipmag.print_direction_mean(my_mean)
Dec: 136.3  Inc: 21.3
Number of directions in mean (n): 4
Angular radius of 95% confidence (a_95): 7.3
Precision parameter (k) estimate: 159.7
- pmagpy.ipmag.print_kent_mean(mean_dictionary)[source]#
Does a pretty job printing a Kent mean and associated statistics.
- Parameters:
mean_dictionary – output dictionary of ipmag.kent_mean
- Returns:
prints the mean and associated statistics
Examples
Generate a Kent mean using
ipmag.kent_mean()
and then print it nicely usingipmag.print_kent_mean()
>>> my_di_block = [[183.2931831390693, 80.70169305002725, 1.0],[75.50744693411644, 79.57922789535208, 1.0], [162.32513875820177, 76.51741087479394, 1.0], [143.8749848879392, 85.79156599168213, 1.0], [177.12011881027854, 84.02007456929348, 1.0]]
>>> my_kent_mean = ipmag.kent_mean(di_block = my_di_block)
>>> ipmag.print_kent_mean(my_kent_mean)
Plon: 150.4  Plat: 83.3
Major axis lon: 31.4  Major axis lat: 3.2
Minor axis lon: 301.0  Minor axis lat: 5.8
Major axis angle of 95% ellipse (Zeta): 6.5
Minor axis angle of 95% ellipse (Eta): 2.8
Number of directions in mean (n): 5
- pmagpy.ipmag.print_pole_mean(mean_dictionary)[source]#
Does a pretty job printing a Fisher mean and associated statistics for mean paleomagnetic poles.
- Parameters:
mean_dictionary – output dictionary of pmag.fisher_mean()
- Returns:
prints the mean and associated statistics
Examples
Generate a Fisher mean using
ipmag.fisher_mean()
and then print it nicely usingipmag.print_pole_mean()
>>> my_mean = ipmag.fisher_mean(di_block=[[140,21],[127,23],[142,19],[136,22]])
>>> ipmag.print_pole_mean(my_mean)
Plon: 136.3  Plat: 21.3
Number of directions in mean (n): 4
Angular radius of 95% confidence (A_95): 7.3
Precision parameter (k) estimate: 159.7
- pmagpy.ipmag.quick_hyst(dir_path='.', meas_file='measurements.txt', save_plots=True, interactive=False, fmt='png', specimen='', verbose=True, n_plots=10, contribution=None, image_records=False)[source]#
makes specimen plots of hysteresis data
- Parameters:
dir_path (str, default ".") – input directory
meas_file (str, default "measurements.txt") – name of MagIC measurement file
save_plots (bool, default True) – save figures
interactive (bool, default False) – if True, interactively plot and display (this is best used on the command line only)
fmt (str, default "png") – format for figures, [“svg”, “jpg”, “pdf”, “png”]
specimen (str, default "") – specific specimen to plot
verbose (bool, default True) – if True, print more verbose output
image_records (bool, default False) – if True, return a list of created images
- Returns:
if image_records == False – Tuple : (True or False indicating if conversion was successful, output file name(s) written)
if image_records == True – Tuple : (True or False indicating if conversion was successful, output file name(s) written, list of images)
- pmagpy.ipmag.rand_correlation_prob(sec_var, delta1, delta2, alpha, trials=10000, print_result=False)[source]#
The function runs an algorithm from Bogue and Coe (1981; doi: 10.1029/JB086iB12p11883) for probabilistic correlation, evaluating the probability that the similarity between two paleomagnetic directions is due to random sampling of the ancient magnetic field. Original written in Python by S. Bogue, translated to PmagPy functionality by AFP.
- Parameters:
sec_var – kappa estimate of regional secular variation (probably 30 or 40)
delta1 – distance of direction 1 from mean direction
delta2 – distance of direction 2 from mean direction
alpha – angle between paleomagnetic directions (or poles)
trials – the number of simulations, default=10,000
print_result – if True, the probability value is printed as a sentence, default=False
- Returns:
- float
number indicating probability value
Example
Provide estimate of regional secular variation, angle between directions, distance of each direction from a mean direction (like GAD) to return probability of random field sampling, compare to RC / 11 comparison from Table 2 of the original publication (exact value may differ due to RNG):
>>> ipmag.rand_correlation_prob(40, 17.2, 20, 3.6)
np.float64(0.0103)
- pmagpy.ipmag.reversal_test_MM1990(dec=None, inc=None, di_block=None, plot_CDF=False, plot_stereo=False, save=False, save_folder='.', fmt='svg')[source]#
Calculates Watson’s V statistic from input files through Monte Carlo simulation in order to test whether normal and reversed populations could have been drawn from a common mean. Also provides the critical angle between the two sample mean directions and the corresponding McFadden and McElhinny (1990) classification. This function is a wrapper around the ipmag.common_mean_watson() function with the first step of splitting the data into two polarities using the pmag.flip() function and flipping the reverse direction to their antipode.
- Parameters:
dec (list, optional) – List of declinations.
inc (list, optional) – List of inclinations.
di_block (list of lists, optional) – Nested list of [dec,inc]. If provided, it takes precedence over separate dec and inc lists.
plot_CDF (bool, optional) – If True, plot the CDF accompanying the results. Defaults to False.
plot_stereo (bool, optional) – If True, plot stereonet with bidirectionally separated data. Defaults to False.
save (bool, optional) – If True, save the plots. Defaults to False.
save_folder (str, optional) – Directory path for saving plots. Defaults to current directory.
fmt (str, optional) – Format of saved figures. Defaults to ‘svg’.
- Returns:
result (bool) – 0 indicates fail, 1 indicates pass.
angle (float) – Angle between the Fisher means of the two data sets.
critical_angle (float) – Critical angle for the test to pass.
classification (str) – MM1990 classification for a positive test.
Examples
Populations of roughly antipodal directions are developed here using ipmag.fishrot. These directions are combined into a single di_block given that the function determines the principal component and splits the data accordingly by polarity.
>>> directions_n = ipmag.fishrot(k=20, n=30, dec=5, inc=-60)
>>> directions_r = ipmag.fishrot(k=35, n=25, dec=182, inc=57)
>>> directions = directions_n + directions_r
>>> ipmag.reversal_test_MM1990(di_block=directions, plot_stereo=True)
Data can also be input to the function as separate lists of dec and inc. In this example, the di_block from above is split into lists of dec and inc which are then used in the function:
>>> direction_dec, direction_inc, direction_moment = ipmag.unpack_di_block(directions)
>>> ipmag.reversal_test_MM1990(dec=direction_dec, inc=direction_inc, plot_stereo=True)
- pmagpy.ipmag.reversal_test_bootstrap(dec=None, inc=None, di_block=None, plot_stereo=False, color1='blue', color2='red', save=False, save_folder='.', fmt='svg', verbose=True)[source]#
Conduct a reversal test using bootstrap statistics (Tauxe, 2010) to determine whether two populations of directions could be from an antipodal common mean.
- Parameters:
dec – list of declinations
inc – list of inclinations
di_block – a nested list of [dec,inc] A di_block can be provided in which case it will be used instead of dec, inc lists.
plot_stereo – before plotting the CDFs, plot stereonet with the bidirectionally separated data (default is False)
save – boolean argument to save plots (default is False)
save_folder – directory where plots will be saved (default is current directory, ‘.’)
fmt – format of saved figures (default is ‘svg’)
- Returns:
A boolean where 0 is fail and 1 is pass is returned. Plots of the cumulative distributions of the Cartesian components are shown; an equal area plot is also shown if plot_stereo = True.
Examples
Populations of roughly antipodal directions are developed here using ipmag.fishrot. These directions are combined into a single di_block given that the function determines the principal component and splits the data accordingly by polarity:
>>> directions_n = ipmag.fishrot(k=20, n=30, dec=5, inc=-60)
>>> directions_r = ipmag.fishrot(k=35, n=25, dec=182, inc=57)
>>> directions = directions_n + directions_r
>>> ipmag.reversal_test_bootstrap(di_block=directions, plot_stereo=True)
Data can also be input to the function as separate lists of dec and inc. In this example, the di_block from above is split into lists of dec and inc which are then used in the function:
>>> direction_dec, direction_inc, direction_moment = ipmag.unpack_di_block(directions)
>>> ipmag.reversal_test_bootstrap(dec=direction_dec, inc=direction_inc, plot_stereo=True)
- pmagpy.ipmag.reversal_test_bootstrap_H23(dec=None, inc=None, di_block=None, num_sims=10000, alpha=0.05, plot=True, save=False, save_folder='.', fmt='svg', verbose=True)[source]#
Bootstrap reversal test following Heslop et al. (2023).
This function calls common_mean_bootstrap_H23 with directional data that have been flipped, for a reversal test. The directional data can be provided either as separate declination and inclination arrays or as a di_block array.
- Parameters:
dec (array) – Array of declinations, only considered if di_block is None.
inc (array) – Array of inclinations, only considered if di_block is None.
di_block (array, optional) – Directional data as [dec, inc] for each sample. If provided, dec and inc are ignored.
num_sims (int, optional) – Number of bootstrap simulations. Default is 10,000.
alpha (float, optional) – Significance level for hypothesis testing. Default is 0.05.
plot (bool, optional) – If True, produce a histogram plot of the test statistic. Default is True.
save (bool, optional) – If True, save the histogram plot. Default is False.
save_folder (str, optional) – Directory where the histogram plot will be saved. Default is the current directory.
fmt (str, optional) – File format for saving the histogram plot. Default is ‘svg’.
- Returns:
- Contains the following elements:
result (int): 0 if null hypothesis is rejected, 1 otherwise.
Lmin (float): The test statistic value.
Lmin_c (float): The critical test statistic value.
p (float): The p-value of the test.
- Return type:
tuple
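Example
A minimal sketch of usage (not from the original documentation): two roughly antipodal populations are simulated with ipmag.fishrot and passed as a single di_block; the returned values will vary because both the directions and the bootstrap are random.
>>> from pmagpy import ipmag
>>> directions_n = ipmag.fishrot(k=20, n=30, dec=5, inc=-60)
>>> directions_r = ipmag.fishrot(k=35, n=25, dec=182, inc=57)
>>> result, Lmin, Lmin_c, p = ipmag.reversal_test_bootstrap_H23(di_block=directions_n + directions_r, plot=False)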
- pmagpy.ipmag.sb_vgp_calc(dataframe, site_correction='yes', dec_tc='dec_tc', inc_tc='inc_tc')[source]#
This function calculates the angular dispersion of VGPs and corrects for within-site dispersion (unless site_correction = ‘no’) to return a value S_b. The input data need to be within a pandas DataFrame.
- Parameters:
dataframe (the name of the pandas.DataFrame containing the data)
columns (the data frame needs to contain these)
dataframe['site_lat'] (latitude of the site)
dataframe['site_lon'] (longitude of the site)
dataframe['k'] (fisher precision parameter for directions)
dataframe['vgp_lat'] (VGP latitude)
dataframe['vgp_lon'] (VGP longitude)
----- (the following default parameters can be changed by keyword argument)
dataframe['inc_tc'] (tilt-corrected inclination)
dataframe['dec_tc'] (tilt-corrected declination)
plot (default is 'no', will make a plot of poles if 'yes')
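Example
A minimal sketch (not from the original documentation) using a toy DataFrame with the documented columns; a real study would use many more sites, and here site_correction='no' is used so that only the VGP positions enter the calculation.
>>> import pandas as pd
>>> from pmagpy import ipmag
>>> df = pd.DataFrame({'site_lat': [45.0, 45.2, 44.9], 'site_lon': [110.0, 110.3, 109.8],
...                    'k': [60.0, 85.0, 40.0],
...                    'vgp_lat': [80.2, 75.6, 84.1], 'vgp_lon': [150.3, 200.2, 320.5],
...                    'dec_tc': [2.5, 355.0, 10.1], 'inc_tc': [62.0, 58.5, 66.2]})
>>> Sb = ipmag.sb_vgp_calc(df, site_correction='no')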
- pmagpy.ipmag.separate_directions(dec=None, inc=None, di_block=None)[source]#
Separates directional data into two modes based on the principal direction.
- Parameters:
dec (list, optional) – List of declinations. Defaults to None.
inc (list, optional) – List of inclinations. Defaults to None.
di_block (list of lists, optional) – Nested list of [dec,inc]. Can be provided instead of separate dec, inc lists. If provided, it takes precedence.
- Returns:
- Depending on input, either:
dec1, inc1, dec2, inc2: Lists of declinations and inclinations for the two modes (if separate dec, inc lists are provided)
polarity1, polarity2: Nested lists of [dec,inc] for the two modes (if di_block is provided)
- Return type:
tuple
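Example
A minimal sketch (not from the original documentation): a mixed-polarity di_block built with ipmag.fishrot is split into its two modes.
>>> from pmagpy import ipmag
>>> directions_n = ipmag.fishrot(k=20, n=30, dec=5, inc=-60)
>>> directions_r = ipmag.fishrot(k=35, n=25, dec=182, inc=57)
>>> mode1, mode2 = ipmag.separate_directions(di_block=directions_n + directions_r)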
- pmagpy.ipmag.shoot(lon, lat, azimuth, maxdist=None)[source]#
This function enables A95 error ellipses to be drawn around paleomagnetic poles in conjunction with equi (from: http://www.geophysique.be/2011/02/20/matplotlib-basemap-tutorial-09-drawing-circles/)
- pmagpy.ipmag.simul_correlation_prob(alpha, k1, k2, trials=10000, print_result=False)[source]#
This function runs the algorithm of Bogue and Coe (1981; doi: 10.1029/JB086iB12p11883) for probabilistic correlation, evaluating the probability that the similarity between two paleomagnetic directions is due to simultaneous sampling of the ancient magnetic field. Originally written in Python by S. Bogue, translated to PmagPy functionality by AFP.
- Parameters:
alpha – angle between paleomagnetic directions (site means)
k1 (float) – kappa estimate for first direction
k2 (float) – kappa estimate for second direction
trials (int) – the number of simulations [default = 10,000]
print_result (boolean) – the probability value returned in a sentence [default = False]
- Returns:
- float
number indicating probability value
Example
Provide an angle and two precision parameter estimates to get the probability of simultaneity. Compare with the RC / 11 comparison in Table 2 of the original publication (the exact value may differ due to random number generation):
>>> ipmag.simul_correlation_prob(3.6, 391, 146)
0.8127
- pmagpy.ipmag.sites_extract(site_file='sites.txt', directions_file='directions.xls', intensity_file='intensity.xls', info_file='site_info.xls', output_dir_path='.', input_dir_path='', latex=False)[source]#
Extracts directional and/or intensity data from a MagIC 3.0 format sites.txt file. Default output format is an Excel file. Optional latex format longtable file which can be uploaded to Overleaf or typeset with latex on your own computer.
- Parameters:
site_file (str) – input file name
directions_file (str) – output file name for directional data
intensity_file (str) – output file name for intensity data
info_file (str) – output file name for site information (lat, lon, location, age….)
output_dir_path (str) – path for output files
input_dir_path (str) – path for input file if different from output_dir_path (default is same)
latex (boolean) – if True, output file should be latex formatted table with a .tex ending
Return – [True,False], error type : True if successful
Effects – writes Excel or LaTeX formatted tables for use in publications
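Example
A minimal sketch (not from the original documentation), assuming a MagIC-format sites.txt file in the working directory; Excel tables are written to the output directory.
>>> from pmagpy import ipmag
>>> ipmag.sites_extract(site_file='sites.txt', output_dir_path='.', latex=False)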
- pmagpy.ipmag.smooth(x, window_len, window='bartlett')[source]#
Smooth the data using a sliding window with requested size - meant to be used with the ipmag function curie(). This method is based on the convolution of a scaled window with the signal. The signal is prepared by padding the beginning and the end of the signal with the average of the first (last) ten values of the signal, to avoid jumps at the beginning/end. Output is an array of the smoothed signal.
Required Parameters#
x : the input signal, equally spaced!
window_len : the dimension of the smoothing window
Optional Parameters (defaults are used if not specified)#
- window : type of window from the numpy library [‘flat’, ‘hanning’, ‘hamming’, ‘bartlett’, ‘blackman’] (default is ‘bartlett’)
-a flat window will produce a moving average smoothing
-the Bartlett window is very similar to a triangular window, but always ends with zeros at points 1 and n
-hanning, hamming and blackman windows are used for smoothing the Fourier transform
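Example
A minimal sketch (not from the original documentation) smoothing a noisy, equally spaced synthetic signal with an 11-point Bartlett window; the synthetic data are purely illustrative.
>>> import numpy as np
>>> from pmagpy import ipmag
>>> x = np.sin(np.linspace(0, 4 * np.pi, 200)) + np.random.normal(0, 0.2, 200)
>>> smoothed = ipmag.smooth(x, 11, window='bartlett')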
- pmagpy.ipmag.specimens_extract(spec_file='specimens.txt', output_file='specimens.xls', landscape=False, longtable=False, output_dir_path='.', input_dir_path='', latex=False)[source]#
Extracts specimen results from a MagIC 3.0 format specimens.txt file. Default output format is an Excel file. An optional LaTeX-formatted table can be produced instead, which can be typeset with LaTeX on your own computer.
- Parameters:
spec_file (str, default "specimens.txt") – input file name
output_file (str, default "specimens.xls") – output file name
landscape (boolean, default False) – if True output latex landscape table
longtable (boolean) – if True output latex longtable
output_dir_path (str, default ".") – output file directory
input_dir_path (str, default "") – path for intput file if different from output_dir_path (default is same)
latex (boolean, default False) – if True, output file should be latex formatted table with a .tex ending
Return – [True,False], data table error type : True if successful
Effects – writes xls or latex formatted tables for use in publications
- pmagpy.ipmag.specimens_results_magic(infile='pmag_specimens.txt', measfile='magic_measurements.txt', sampfile='er_samples.txt', sitefile='er_sites.txt', agefile='er_ages.txt', specout='er_specimens.txt', sampout='pmag_samples.txt', siteout='pmag_sites.txt', resout='pmag_results.txt', critout='pmag_criteria.txt', instout='magic_instruments.txt', plotsites=False, fmt='svg', dir_path='.', cors=[], priorities=['DA-AC-ARM', 'DA-AC-TRM'], coord='g', user='', vgps_level='site', do_site_intensity=True, DefaultAge=['none'], avg_directions_by_sample=False, avg_intensities_by_sample=False, avg_all_components=False, avg_by_polarity=False, skip_directions=False, skip_intensities=False, use_sample_latitude=False, use_paleolatitude=False, use_criteria='default')[source]#
Writes magic_instruments, er_specimens, pmag_samples, pmag_sites, pmag_criteria, and pmag_results. The data used to write these are obtained by reading a pmag_specimens, a magic_measurements, an er_samples, an er_sites, and an er_ages file.
@param -> infile: path from the WD to the pmag specimens table
@param -> measfile: path from the WD to the magic measurements file
@param -> sampfile: path from the WD to the er samples file
@param -> sitefile: path from the WD to the er sites data file
@param -> agefile: path from the WD to the er ages data file
@param -> specout: path from the WD to the place to write the er specimens data file
@param -> sampout: path from the WD to the place to write the pmag samples data file
@param -> siteout: path from the WD to the place to write the pmag sites data file
@param -> resout: path from the WD to the place to write the pmag results data file
@param -> critout: path from the WD to the place to write the pmag criteria file
@param -> instout: path from the WD to the place to write the magic instruments file
@param -> documentation incomplete; if you know more about the purpose of the parameters in this function and its side effects, please extend and complete this string
- pmagpy.ipmag.squish(incs, f)[source]#
This function applies a flattening factor (f) to inclination data (incs) and returns ‘squished’ values.
- Parameters:
incs (list of inclination values or a single value)
f (flattening factor) – A value between 0.0 and 1.0 where 1.0 is no flattening and 0.0 is complete flattening.
- Returns:
incs_squished
- Return type:
List of flattened directions (in degrees)
Examples
Take a list of inclinations and flatten (i.e. “squish”) them:
>>> inclinations = [43, 47, 41]
>>> ipmag.squish(inclinations, 0.4)
[20.455818908027187, 23.216791019112204, 19.173314360172309]
- pmagpy.ipmag.thellier_magic(meas_file='measurements.txt', dir_path='.', input_dir_path='', spec='', n_specs=5, save_plots=True, fmt='svg', interactive=False, contribution=None, image_records=False)[source]#
thellier_magic makes Arai plots and other useful plots for Thellier-type experimental data
- Parameters:
meas_file (str) – input measurement file, default “measurements.txt”
dir_path (str) – output directory, default ".". Note: if using Windows, all figures will be saved to the working directory, not dir_path
input_dir_path (str) – input file directory IF different from dir_path, default “”
spec (str) – default “”, specimen to plot
n_specs (int) – number of specimens to plot, default 5; to make all possible plots, specify "all"
save_plots (bool, default True) – True, create and save all requested plots
fmt (str) – format of saved figures (default is ‘svg’)
interactive (bool, default False) – interactively plot and display for each specimen (this is best used on the command line only)
contribution (cb.Contribution, default None) – if provided, use Contribution object instead of reading in data from files
image_records (bool, default False) – if True, generate and return a record for each image in a list of dicts, which can be ingested by pmag.magic_write
- Returns:
status (True or False)
saved (list of figures saved)
if image_records == True – image_recs : list of image records
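Example
A minimal sketch (not from the original documentation), assuming a MagIC measurements.txt file with Thellier-type data in the working directory; Arai plots for up to five specimens are saved as png files.
>>> from pmagpy import ipmag
>>> ipmag.thellier_magic(meas_file='measurements.txt', n_specs=5, save_plots=True, fmt='png')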
- pmagpy.ipmag.tk03(n=100, dec=0, lat=0, rev='no', G1=-18000.0, G2=0, G3=0, B_threshold=0)[source]#
Generates vectors drawn from the TK03.gad model of secular variation (Tauxe and Kent, 2004) at a given latitude and rotated about a vertical axis by the given declination. Returns a nested list of [dec,inc,intensity].
- Parameters:
n (number of vectors to determine (default is 100))
dec (mean declination of data set (default is 0))
lat (latitude at which secular variation is simulated (default is 0))
rev (if reversals are to be included this should be 'yes' (default is 'no'))
G1 (specify average g_1^0 fraction (default is -18e3 in nT, minimum = 1))
G2 (specify average g_2^0 fraction (default is 0))
G3 (specify average g_3^0 fraction (default is 0))
B_threshold (return vectors with B > B_threshold (in nT); default is 0, which returns all vectors)
- Returns:
tk_03_output
- Return type:
a nested list of declination, inclination, and intensity (in nT)
Examples
>>> ipmag.tk03(n=5, dec=0, lat=0)
[[14.752502674158681, -36.189370642603834, 16584.848620957589], [9.2859465437113311, -10.064247301056071, 17383.950391596223], [2.4278460589582913, 4.8079990844938019, 18243.679003572055], [352.93759572283585, 0.086693343935840397, 18524.551174838372], [352.48366219759953, 11.579098286352332, 24928.412830772766]]
- pmagpy.ipmag.transform_to_geographic(this_spec_meas_df, samp_df, samp, coord='0')[source]#
Transform decs/incs to geographic coordinates. Calls pmag.dogeo_V for the heavy lifting
- Parameters:
this_spec_meas_df (pandas dataframe of measurements for a single specimen)
samp_df (pandas dataframe of samples)
samp (sample name)
- Returns:
this_spec_meas_df
- Return type:
measurements dataframe with transformed coordinates
- pmagpy.ipmag.unpack_di_block(di_block)[source]#
This function unpacks a nested list of [dec,inc,mag_moment] into a list of declination values, a list of inclination values and a list of magnetic moment values. Mag_moment values are optional, while dec and inc values are required.
- Parameters:
di_block (nested list of declination, inclination lists)
- Returns:
dec (list of declinations)
inc (list of inclinations)
mag_moment (list of magnetic moment (if present in di_block))
Example
The di_block nested lists of lists can be unpacked using the function
>>> directions = [[180.3, 12.1, 1.0], [179.2, 13.7, 1.0], [177.2, 11.9, 1.0]]
>>> ipmag.unpack_di_block(directions)
([180.3, 179.2, 177.2], [12.1, 13.7, 11.9], [1.0, 1.0, 1.0])
These unpacked values can be assigned to variables:
>>> dec, inc, moment = ipmag.unpack_di_block(directions)
- pmagpy.ipmag.unpack_magic(infile=None, dir_path='.', input_dir_path='', overwrite=False, print_progress=True, data_model=3.0, separate_locs=False, txt='', excel=False)[source]#
Wrapper function for ipmag.download_magic, to handle the unpacking of a MagIC contribution.
This function takes in a text file, typically downloaded from the MagIC database, and then unpacks it into MagIC-formatted files. The name emphasizes the “unpacking” nature of the operation over the “downloading” aspect.
- Parameters:
infile – str, optional Name of the MagIC-format file to unpack.
dir_path – str, optional Directory path for output. Default is the current directory.
input_dir_path – str, optional Path to the input file if different from dir_path. Default is dir_path.
overwrite – bool, optional Whether to overwrite files in the current directory. Default is False.
print_progress – bool, optional Whether to print progress messages. Default is True.
data_model – float, optional Specifies the MagIC data model version, either 2.5 or 3. Default is 3.
separate_locs – bool, optional If True, create separate directories for each location. Default is False.
txt – str, optional Alternative to providing an infile, you can provide the file contents as a string. Useful for directly downloading a MagIC file from EarthRef. Default is an empty string.
excel – bool, optional If True, the input file is treated as an Excel spreadsheet. Default is False.
- Returns:
- bool
True if the unpacking operation is successful. False otherwise.
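Example
A minimal sketch (not from the original documentation); the file name is hypothetical and stands in for a contribution text file downloaded from the MagIC database.
>>> from pmagpy import ipmag
>>> ipmag.unpack_magic('magic_contribution.txt', dir_path='my_project', input_dir_path='downloads', print_progress=False)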
- pmagpy.ipmag.unsquish(incs, f)[source]#
This function applies a flattening factor (f) to unflatten inclination data (incs) and returns ‘unsquished’ values.
- Parameters:
incs (list of inclination values or a single value)
f (flattening factor) – A value greater than 0.0 and less than or equal to 1.0 that is used to unflatten inclination values. 1.0 implies no flattening and will result in no change.
- Returns:
incs_unsquished
- Return type:
List of unflattened inclinations (in degrees)
Examples
Take a list of inclinations, flatten them using ipmag.squish, and then use ipmag.unsquish with the same flattening factor to unflatten (i.e. “unsquish”) them:
>>> inclinations = [43, 47, 41]
>>> squished_incs = ipmag.squish(inclinations, 0.4)
>>> ipmag.unsquish(squished_incs, 0.4)
[43.0, 47.0, 41.0]
- pmagpy.ipmag.upload_magic(concat=False, dir_path='.', input_dir_path='.', validate=True, verbose=True)[source]#
Finds all magic files in a given directory, and compiles them into an upload.txt file which can be uploaded into the MagIC database. If username/password set, then data will be uploaded to private workspace, otherwise validation will be done on this computer.
- Parameters:
concat (boolean) – if True, concatenate to an existing upload.txt file in dir_path; if False, write a new file (default is False)
dir_path (string for output directory (default "."))
input_dir_path (str, default ".")
validate (boolean) – validate upload file on MagIC’s public endpoint
verbose (boolean) – if True print progress and validation results
- Returns:
a tuple: (False, error_message, validation dictionary val_response[‘validation’]) if there was a problem creating/validating the upload file, or (filename, ‘’, None) if the file creation was fully successful.
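Example
A minimal sketch (not from the original documentation), assuming a directory my_project containing MagIC-format text files; the compiled upload file is validated against MagIC's public endpoint.
>>> from pmagpy import ipmag
>>> ipmag.upload_magic(dir_path='my_project', validate=True)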
- pmagpy.ipmag.upload_magic2(concat=0, dir_path='.', data_model=None)[source]#
Finds all magic files in a given directory, and compiles them into an upload.txt file which can be uploaded into the MagIC database. Returns a tuple of either: (False, error_message, errors) if there was a problem creating/validating the upload file or: (filename, ‘’, None) if the upload was fully successful.
- pmagpy.ipmag.upload_to_private_contribution(contribution_id, upload_file, username='', password='')[source]#
Upload to a private contribution on earthref.org/MagIC.
- Parameters:
contribution_id (int) – ID of the MagIC contribution to upload to
upload_file (str) – file to upload (complete path)
username (str) – personal username for MagIC
password (str) – password for username
- Returns:
response –
- response.status_code : bool
True : successful creation of private workspace
- response[‘url’] : str
URL of request
- response[‘method’] = ’PUT’
- response[‘errors’] : str
if unsuccessful, error message
- Return type:
API requests.models.Response
- pmagpy.ipmag.validate_magic(top_dir, doi=False, private_key=False, contribution_id=False)[source]#
download and validate a magic contribution
- Parameters:
top_dir (str) – name of project
doi (str) – DOI of paper to download
contribution_id (str) – id of contribution
private_key (str) – private key of contribution in private workspace
- pmagpy.ipmag.validate_private_contribution(contribution_id, username='', password='', verbose=True)[source]#
validate private contribution in MagIC
- Parameters:
contribution_id (int) – ID of the MagIC contribution to validate
username (str) – personal username for MagIC
password (str) – password for username
verbose (bool) – if True, print error messages
- Returns:
response –
- response.status_code: bool
True : successful validation of private workspace
- response[‘url’] : str
URL of request
- response[‘results’] : dictionary of validation results
- response[‘method’] = ’POST’
- response[‘errors’] : str
if unsuccessful, error message
- Return type:
API requests.models.Response
- pmagpy.ipmag.validate_with_public_endpoint(contribution_file, verbose=False)[source]#
validate contribution to MagIC using public endpoint
- Parameters:
contribution_file (str) – file to validate
verbose (bool) – if True, print error messages
- Returns:
response –
- response.status_code: bool
True : successful validation of private workspace
- response[‘errors’] : None or ‘trouble validating’
- response[‘validation_results’] : dictionary of validation errors
- response[‘warnings’] : list of warnings
- Return type:
API requests.models.Response
- pmagpy.ipmag.vgp_calc(dataframe, tilt_correction='yes', site_lon='site_lon', site_lat='site_lat', dec_is='dec_is', inc_is='inc_is', dec_tc='dec_tc', inc_tc='inc_tc', recalc_label=False)[source]#
This function calculates paleomagnetic poles using directional data and site location data within a pandas.DataFrame. The function adds the columns ‘paleolatitude’, ‘vgp_lat’, ‘vgp_lon’, ‘vgp_lat_rev’, and ‘vgp_lon_rev’ to the dataframe. The ‘_rev’ columns allow for subsequent choice as to which polarity will be used for the VGPs.
- Parameters:
dataframe (the name of the pandas.DataFrame containing the data)
tilt-correction ('yes' is the default and uses tilt-corrected data (dec_tc, inc_tc), 'no' uses data that is not tilt-corrected and is in geographic coordinates)
dataframe['site_lat'] (the name of the Dataframe column containing the latitude of the site)
dataframe['site_lon'] (the name of the Dataframe column containing the longitude of the site)
dataframe['inc_tc'] (the name of the Dataframe column containing the tilt-corrected inclination (used by default tilt-correction='yes'))
dataframe['dec_tc'] (the name of the Dataframe column containing the tilt-corrected declination (used by default tilt-correction='yes'))
dataframe['inc_is'] (the name of the Dataframe column containing the insitu inclination (used when tilt-correction='no'))
dataframe['dec_is'] (the name of the Dataframe column containing the insitu declination (used when tilt-correction='no'))
- Returns:
dataframe[‘paleolatitude’]
dataframe[‘colatitude’]
dataframe[‘vgp_lat’]
dataframe[‘vgp_lon’]
dataframe[‘vgp_lat_rev’]
dataframe[‘vgp_lon_rev’]
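Example
A minimal sketch (not from the original documentation) with a toy single-site DataFrame; the VGP columns are added to the DataFrame, and the augmented DataFrame is assumed to be returned.
>>> import pandas as pd
>>> from pmagpy import ipmag
>>> df = pd.DataFrame({'site_lat': [45.0], 'site_lon': [110.0], 'dec_tc': [350.5], 'inc_tc': [62.0]})
>>> df = ipmag.vgp_calc(df)
>>> df[['vgp_lat', 'vgp_lon']]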
- pmagpy.ipmag.vgpmap_magic(dir_path='.', results_file='sites.txt', crd='', sym='ro', size=8, rsym='g^', rsize=8, fmt='pdf', res='c', proj='ortho', flip=False, anti=False, fancy=False, ell=False, ages=False, lat_0=0, lon_0=0, save_plots=True, interactive=False, contribution=None, image_records=False)[source]#
makes a map of vgps and a95/dp,dm for site means in a sites table
- Parameters:
dir_path (str, default ".") – input directory path
results_file (str, default "sites.txt") – name of MagIC format sites file
crd (str, default "") – coordinate system [g, t] (geographic, tilt_corrected)
sym (str, default "ro") – symbol color and shape, default red circles (see matplotlib documentation for more color/shape options)
size (int, default 8) – symbol size
rsym (str, default "g^") – symbol for plotting reverse poles (see matplotlib documentation for more color/shape options)
rsize (int, default 8) – symbol size for reverse poles
fmt (str, default "pdf") – format for figures, [“svg”, “jpg”, “pdf”, “png”]
res (str, default "c") – resolution [c, l, i, h] (crude, low, intermediate, high)
proj (str, default "ortho") – ortho = orthographic lcc = lambert conformal moll = molweide merc = mercator
flip (bool, default False) – if True, flip reverse poles to normal antipode
anti (bool, default False) – if True, plot antipodes for each pole
fancy (bool, default False) – if True, plot topography (not yet implemented)
ell (bool, default False) – if True, plot ellipses
ages (bool, default False) – if True, plot ages
lat_0 (float, default 0.) – eyeball latitude
lon_0 (float, default 0.) – eyeball longitude
save_plots (bool, default True) – if True, create and save all requested plots
interactive (bool, default False) –
- if True, interactively plot and display
(this is best used on the command line only)
image_records (bool, default False) – if True, return a list of created images
- Returns:
if image_records == False – Tuple : (True or False indicating if conversion was successful, file name(s) written)
if image_records == True – Tuple : (True or False indicating if conversion was successful, output file name written, list of image recs)
- pmagpy.ipmag.zeq(path_to_file='.', file='', data='', units='U', calculation_type='DE-BFL', save=False, save_folder='.', fmt='svg', begin_pca='', end_pca='', angle=0, make_plots=True, show_data=True)[source]#
- NAME
zeq.py
- DESCRIPTION
- plots demagnetization data for a single specimen:
The solid (open) symbols in the Zijderveld diagram are X,Y (X,Z) pairs. The demagnetization diagram plots the
fractional remanence remaining after each step. The green line is the fraction of the total remanence removed between each step. If the principal direction is desired, specify begin_pca and end_pca steps as bounds for the calculation.
-The equal area projection has the X direction (usually North in geographic coordinates) to the top. The red line is the X axis of the Zijderveld diagram. Solid symbols are lower hemisphere.
-In the intensity plot, the red dots and blue line show the remanence remaining after each step; the green line is the partial TRM removed in each interval.
- INPUT FORMAT
reads from file_name or takes a Pandas DataFrame data with specimen treatment intensity declination inclination as columns
- Keywords:
- file= FILE a space or tab delimited file with
specimen treatment declination inclination intensity
units= [mT,C] specify units of mT OR C, default is unscaled
save= [True,False] save figure and quit, default is False
fmt= [svg,jpg,png,pdf] set figure format, default is svg
begin_pca= [step number] treatment step for beginning of PCA calculation, default
end_pca= [step number] treatment step for end of PCA calculation, last step is default
calculation_type= [DE-BFL,DE-BFP,DE-FM] calculation type: best-fit line, plane or Fisher mean; line is default
angle= [0-360] angle to subtract from declination to rotate in horizontal plane, default is 0
- pmagpy.ipmag.zeq_magic(meas_file='measurements.txt', spec_file='', crd='s', dir_path='.', input_dir_path='', angle=0, n_plots=5, save_plots=True, fmt='svg', interactive=False, specimen='', samp_file='samples.txt', contribution=None, fignum=1, image_records=False)[source]#
zeq_magic makes Zijderveld and equal area plots for MagIC formatted measurements files.
- Parameters:
meas_file (str) – input measurement file
spec_file (str) – input specimen interpretation file
samp_file (str) – input sample orientations file
crd (str) – coordinate system [s,g,t] for specimen, geographic, tilt corrected; the g and t options require a sample file with specimen and bedding orientation
dir_path (str) – output directory for plots, default “.”
input_dir_path (str) – input directory, if different from dir_path, default “”
angle (float) – angle of X direction with respect to specimen X
n_plots (int, default 5) – maximum number of plots to make; to make all possible plots, specify "all"
save_plots (bool, default True) – if True, create and save all requested plots
fmt (str, default "svg") – format for figures, [svg, jpg, pdf, png]
interactive (bool, default False) – interactively plot and display for each specimen (this is best used on the command line only)
specimen (str, default "") – specimen name to plot
samp_file (str, default 'samples.txt') – name of samples file
contribution (cb.Contribution, default None) – if provided, use Contribution object instead of reading in data from files
fignum (matplotlib figure number)
image_records (bool, default False) – if True, generate and return a record for each image in a list of dicts, which can be ingested by pmag.magic_write
- Returns:
if image_records == False – Tuple : (True or False indicating if conversion was successful, output file name written)
if image_records == True – Tuple : (True or False indicating if conversion was successful, output file name written, list of image recs)
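Example
A minimal sketch (not from the original documentation), assuming MagIC measurements.txt and samples.txt files in the working directory; the specimen name is hypothetical.
>>> from pmagpy import ipmag
>>> ipmag.zeq_magic(meas_file='measurements.txt', specimen='mc01a', crd='s', save_plots=True, fmt='png')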
pmagpy.pmagplotlib#
- pmagpy.pmagplotlib.add_borders(Figs, titles, border_color='#000000', text_color='#800080', con_id='')[source]#
Formatting for generating plots on the server. Default border color: black. Default text color: purple.
- pmagpy.pmagplotlib.delticks(fig)[source]#
deletes half the x-axis tick marks
- Parameters:
fig (matplotlib figure number)
- pmagpy.pmagplotlib.draw_figs(FIGS)[source]#
Can only be used if matplotlib backend is set to TKAgg Does not play well with wxPython :param FIGS: :type FIGS: dictionary of figure names as keys and numbers as values
- pmagpy.pmagplotlib.gaussfunc(y, ybar, sigma)[source]#
cumulative normal distribution function of the variable y with mean ybar,standard deviation sigma uses expression 7.1.26 from Abramowitz & Stegun accuracy better than 1.5e-7 absolute :param y: :type y: input variable :param ybar: :type ybar: mean :param sigma: :type sigma: standard deviation
- pmagpy.pmagplotlib.k_s(X)[source]#
Kolmogorov-Smirnov statistic. Finds the probability that the data are distributed as func - uses the method of Numerical Recipes (Press et al., 1986)
- pmagpy.pmagplotlib.label_tiepoints(ax, x, tiepoints, levels, color='black', lines=False)[source]#
Puts on labels for tiepoints in an age table on a stratigraphic plot.
- Parameters:
ax (obj) – axis on which to plot the labels
x (float or integer) – x value for the tiepoint labels
levels (float) – stratigraphic positions of the tiepoints
lines (bool) – put on horizontal lines at the tiepoint heights
- Returns:
ax – axis object
- Return type:
obj
- pmagpy.pmagplotlib.msp_magic(spec_df, axa='', axb='', site='site', labels=['a)', 'b)'], save_plots=False, fmt='pdf')[source]#
makes plots and calculations for MSP method of Dekkers & Boehnel (2006) (DB) and Fabian and Leonhardt (2010) method (DSC) of multi-specimen paleointensity technique. NB: this code requires seaborn and scipy to be installed
Parameters:#
- spec_df : pandas dataframe
data frame with MagIC measurement formatted data for one MSP experiment. Measurements must have these MagIC method codes: M0 (NRM step): must contain ‘LT-NO’; M1 (pTRM at T parallel to NRM): must contain ‘LT-NRM-PAR’ and not ‘LT-PTRM-I’; M2 (pTRM at T anti-parallel to NRM): must contain ‘LT-NRM-APAR’; M3 (heat to T, cool in lab field): must contain ‘LT-T-Z-NRM-PAR’; M4 (repeat of M1): must contain ‘LT-PTRM-I’. The lab field must be in ‘treat_dc_field’.
- axa : matplotlib figure subplot for DB plot, default is to create and return
- axb : matplotlib figure subplot for DSC plot, default is to create and return
- site : name of group of specimens for y-axis label, default is generic ‘site’
- labels : plot labels as specified
- save_plots : bool, default False; if True, create and save plot
- fmt : str
format of saved figure (default is ‘pdf’)
- Returns:
B (in uT), standard error of the slope, axa, axb
- pmagpy.pmagplotlib.plot_arai(fignum, indata, s, units)[source]#
makes Arai plots for Thellier-Thellier type experiments
- Parameters:
fignum (figure number of matplotlib plot object)
indata (nested list of data for Arai plots:) – the araiblock of data prepared by pmag.sortarai()
s (specimen name)
units ([K, J, ""] (kelvin, joules, unknown))
Effects
_______
makes the Arai plot
- pmagpy.pmagplotlib.plot_arai_zij(ZED, araiblock, zijdblock, s, units)[source]#
calls the four plotting programs for Thellier-Thellier experiments
- Parameters:
ZED (dictionary with plotting figure keys:) –
deremag : figure for de (re) magnetization plots
arai : figure for the Arai diagram
eqarea : equal area projection of data, color coded by: red circles = ZI steps, blue squares = IZ steps, yellow triangles = pTRM steps
zijd : Zijderveld diagram color coded by ZI, IZ steps
deremag : demagnetization and remagnetization versus temperature
araiblock (nested list of required data from Arai plots)
zijdblock (nested list of required data for Zijderveld plots)
s (specimen name)
units (units for the arai and zijderveld plots)
Effects
________
Makes four plots from the data by calling:
plot_arai (Arai plots)
plot_teq (equal area projection for Thellier data)
plotZ (Zijderveld diagram)
plot_np (de (re) magnetization diagram)
- pmagpy.pmagplotlib.plot_b(Figs, araiblock, zijdblock, pars)[source]#
deprecated (used in thellier_magic/microwave_magic)
- pmagpy.pmagplotlib.plot_bcr(fignum, Bcr1, Bcr2)[source]#
function to plot two estimates of Bcr against each other
- pmagpy.pmagplotlib.plot_cdf(fignum, data, xlab, sym, title, **kwargs)[source]#
Makes a plot of the cumulative distribution function. :param fignum: :type fignum: matplotlib figure number :param data: :type data: list of data to be plotted - doesn’t need to be sorted :param sym: :type sym: matplotlib symbol for plotting, e.g., ‘r–’ for a red dashed line :param **kwargs: :type **kwargs: optional dictionary with {‘color’: color, ‘linewidth’:linewidth, ‘fontsize’:fontsize for axes labels}
- Returns:
x (sorted list of data)
y (fraction of cdf)
- pmagpy.pmagplotlib.plot_circ(fignum, pole, ang, col)[source]#
function to put a small circle on an equal area projection plot, fig,fignum :param fignum: :type fignum: matplotlib figure number :param pole: :type pole: dec,inc of center of circle :param ang: :type ang: angle of circle :param col:
- pmagpy.pmagplotlib.plot_conf(fignum, s, datablock, pars, new)[source]#
plots directions and confidence ellipses
- pmagpy.pmagplotlib.plot_d_delta_m(fignum, Bdm, DdeltaM, s)[source]#
function to plot d (Delta M)/dB curves
- Parameters:
fignum (matplotlib figure number)
Bdm (change in field)
DdeltaM (the d (Delta M)/dB data)
s (specimen name)
- pmagpy.pmagplotlib.plot_day(fignum, BcrBc, S, sym, **kwargs)[source]#
function to plot Day plots
- Parameters:
fignum (matplotlib figure number)
BcrBc (list or array of the ratio of coercivity of remanence to coercivity)
S (list or array ratio of saturation remanence to saturation magnetization (squareness))
sym (matplotlib symbol (e.g., 'rs' for red squares))
**kwargs (dictionary with {'names':[list of names for symbols]})
- pmagpy.pmagplotlib.plot_delta_m(fignum, B, DM, Bcr, s)[source]#
function to plot Delta M curves
- Parameters:
fignum (matplotlib figure number)
B (array of field values)
DM (array of difference between top and bottom curves in hysteresis loop)
Bcr (coercivity of remanence)
s (specimen name)
- pmagpy.pmagplotlib.plot_dir(ZED, pars, datablock, angle)[source]#
function to put the great circle on the equal area projection and plot start and end points of calculation
DEPRECATED (used in zeq_magic)
- pmagpy.pmagplotlib.plot_ell(fignum, pars, col='k', lower=True, plot=True)[source]#
function to calculate/plot points on an ellipse about Pdec,Pdip with angle beta,gamma :param fignum: :type fignum: matplotlib figure number :param pars: where P is direction, Bdec,Binc are beta direction, and Gdec,Ginc are gamma direction :type pars: list of [Pdec, Pinc, beta, Bdec, Binc, gamma, Gdec, Ginc ] :param col: :type col: color for ellipse (default is black ‘k’) :param lower: :type lower: boolean, if True, lower hemisphere projection :param plot: :type plot: boolean, if False, return the points, if True, make the plot
- pmagpy.pmagplotlib.plot_eq(fignum, DIblock, s)[source]#
plots directions on eqarea projection :param fignum: :type fignum: matplotlib figure number :param DIblock: :type DIblock: nested list of dec/inc pairs :param s: :type s: specimen name
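Example
A minimal sketch (not from the original documentation): a figure is initialized with plot_init and a few illustrative dec/inc pairs are plotted with plot_eq; whether the net is drawn automatically or needs an explicit plot_net call may depend on the PmagPy version.
>>> import matplotlib.pyplot as plt
>>> from pmagpy import pmagplotlib
>>> di_pairs = [[350.0, 55.0], [5.2, 60.1], [12.7, 48.9]]
>>> pmagplotlib.plot_init(1, 5, 5)
>>> pmagplotlib.plot_eq(1, di_pairs, 'example specimen')
>>> plt.show()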
- pmagpy.pmagplotlib.plot_eq_cont(fignum, DIblock, color_map='coolwarm')[source]#
plots dec inc block as a color contour
- Parameters:
fignum (matplotlib figure number)
DIblock (nested pairs of [Declination, Inclination])
color_map (matplotlib color map, default is ‘coolwarm’)
- Returns:
figure
- pmagpy.pmagplotlib.plot_eq_sym(fignum, DIblock, s, sym)[source]#
plots directions with specified symbol :param fignum: :type fignum: matplotlib figure number :param DIblock: :type DIblock: nested list of dec/inc pairs :param s: :type s: specimen name :param sym: :type sym: matplotlib symbol (e.g., ‘bo’ for blue circle)
- pmagpy.pmagplotlib.plot_evec(fignum, Vs, symsize, title)[source]#
plots eigenvector directions of S vectors
- Parameters:
fignum (matplotlib figure number)
Vs (nested list of eigenvectors)
symsize (size in pts for symbol)
title (title for plot)
- pmagpy.pmagplotlib.plot_hdd(HDD, B, M, s)[source]#
Function to make hysteresis, deltaM and DdeltaM plots.
- Parameters:
HDD (dictionary with figure numbers for the keys) – ‘hyst’ : hysteresis plot normalized to maximum value; ‘deltaM’ : Delta M plot; ‘DdeltaM’ : differential of Delta M plot
B (list of field values in tesla)
M (list of magnetizations in arbitrary units)
s (specimen name string)
- Returns:
hpars – dictionary of hysteresis parameters with keys: ‘hysteresis_xhf’, ‘hysteresis_ms_moment’, ‘hysteresis_mr_moment’, ‘hysteresis_bc’
- pmagpy.pmagplotlib.plot_hpars(HDD, hpars, sym)[source]#
function to plot hysteresis parameters deprecated (used in hysteresis_magic)
- pmagpy.pmagplotlib.plot_hs(fignum, Ys, c, ls)[source]#
plots horizontal lines at Ys values
- Parameters:
fignum (matplotlib figure number)
Ys (list of Y values for lines)
c (color for lines)
ls (linestyle for lines)
- pmagpy.pmagplotlib.plot_hys(fignum, B, M, s)[source]#
function to plot hysteresis data
- Parameters:
fignum (matplotlib figure number)
B (list of field values in tesla)
M (list of magnetizations)
- Returns:
hpars – dictionary of hysteresis parameters with keys: [‘hysteresis_xhf’, ‘hysteresis_ms_moment’, ‘hysteresis_mr_moment’, ‘hysteresis_bc’]
deltaM – list of differences between down and upgoing loops
Bdm – field values
- pmagpy.pmagplotlib.plot_imag(fignum, Bimag, Mimag, s)[source]#
function to plot d (Delta M)/dB curves
- pmagpy.pmagplotlib.plot_init(fignum, w, h)[source]#
initializes plot number fignum with width w and height h :param fignum: :type fignum: matplotlib figure number :param w: :type w: width :param h: :type h: height
- pmagpy.pmagplotlib.plot_irm(fignum, B, M, title)[source]#
function to plot IRM backfield curves
- Parameters:
fignum (matplotlib figure number)
B (list or array of field values)
M (list or array of magnetizations)
title (string title for plot)
- pmagpy.pmagplotlib.plot_lnp(fignum, s, datablock, fpars, direction_type_key)[source]#
plots lines and planes on a great circle with alpha 95 and mean
- Parameters:
fignum (number of plt.figure() object)
s (str) – name of site for title
datablock (nested list of dictionaries with keys in 3.0 or 2.5 format) – 3.0 keys: dir_dec, dir_inc, dir_tilt_correction = [-1,0,100], method_codes =[‘DE-BFP’,’DE-BFL’] 2.5 keys: dec, inc, tilt_correction = [-1,0,100],direction_type_key =[‘p’,’l’]
fpars (Fisher parameters calculated by, e.g., pmag.dolnp() or pmag.dolnp3_0())
direction_type_key (key for dictionary direction_type ('specimen_direction_type'))
Effects
_______
plots the site level figure
- pmagpy.pmagplotlib.plot_ltc(LTC_CM, LTC_CT, LTC_WM, LTC_WT, e)[source]#
function to plot low temperature cycling experiments
- pmagpy.pmagplotlib.plot_mag(fignum, datablock, s, num, units, norm)[source]#
plots magnetization against (de)magnetizing temperature or field
- Parameters:
fignum (matplotlib figure number for plotting)
datablock (nested list of [step, 0, 0, magnetization, 1,quality])
s (string for title)
num (matplotlib figure number, can set to 1)
units ([T,K,U] for tesla, kelvin or arbitrary)
norm ([True,False] if True, normalize)
Effects
______ – plots figure
- pmagpy.pmagplotlib.plot_mag_map(fignum, element, lons, lats, element_type, cmap='coolwarm', lon_0=0, date='', contours=False, proj='PlateCarree', min=False, max=False)[source]#
makes a color contour map of geomagnetic field element
- Parameters:
fignum (matplotlib figure number)
element (plots a color contour map with the desired field)
lons (longitude array from pmag.do_mag_map for plotting)
lats (latitude array from pmag.do_mag_map for plotting)
element_type ([B,Br,I,D] geomagnetic element type) – B : field intensity Br : radial field intensity I : inclinations D : declinations
Optional
_________
contours (plot the contour lines on top of the heat map if True)
proj (cartopy projection ['PlateCarree','Mollweide']) – NB: The Mollweide projection can only be used reliably with cartopy=0.17.0; otherwise use lon_0=0. Also, for declinations, PlateCarree is recommended.
cmap (matplotlib color map - see https://matplotlib.org/examples/color/colormaps_reference.html for options)
lon_0 (central longitude of the Mollweide projection)
date (date used for field evaluation,) – if custom ghfile was used, supply filename
min (int) – minimum value for color contour on intensity map : default is minimum value - useful for making many maps with same scale
max (int) – maximum value for color contour on intensity map : default is maximum value - useful for making many maps with same scale
Effects
______________
plots a color contour map of the desired geomagnetic element
- pmagpy.pmagplotlib.plot_map(fignum, lats, lons, Opts)[source]#
makes a cartopy map with lats/lons Requires installation of cartopy
Parameters:#
fignum : matplotlib figure number
lats : array or list of latitudes
lons : array or list of longitudes
Opts : dictionary of plotting options:
- Opts.keys=
- proj : projection [supported cartopy projections:
pc = Plate Carree, aea = Albers Equal Area, aeqd = Azimuthal Equidistant, lcc = Lambert Conformal, lcyl = Lambert Cylindrical, merc = Mercator, mill = Miller Cylindrical, moll = Mollweide [default], ortho = Orthographic, robin = Robinson, sinu = Sinusoidal, stere = Stereographic, tmerc = Transverse Mercator, utm = UTM [set zone and south keys in Opts], laea = Lambert Azimuthal Equal Area, geos = Geostationary, npstere = North-Polar Stereographic, spstere = South-Polar Stereographic]
latmin : minimum latitude for plot
latmax : maximum latitude for plot
lonmin : minimum longitude for plot
lonmax : maximum longitude
lat_0 : central latitude
lon_0 : central longitude
sym : matplotlib symbol for plotting
symsize : symbol size in pts
edge : markeredgecolor
cmap : matplotlib color map
res : resolution [c,l,i,h] for crude/low, intermediate, high
boundinglat : bounding latitude
names : list of names for lats/lons (if empty, none will be plotted)
pltgrd : if True, put on grid lines
padlat : padding of latitudes
padlon : padding of longitudes
gridspace : grid line spacing
global : global projection [default is True]
oceancolor : ‘azure’
landcolor : ‘bisque’ [choose any of the valid color names for matplotlib]
- detailsdictionary with keys:
coasts : if True, plot coastlines rivers : if True, plot rivers states : if True, plot states countries : if True, plot countries ocean : if True, plot ocean lakes : if True, plot lakes fancy : if True, plot etopo 20 grid
NB: etopo must be installed
- if Opts keys not set :these are the defaults:
Opts={‘latmin’:-90, ‘latmax’:90, ‘lonmin’:0, ‘lonmax’:360, ‘lat_0’:0, ‘lon_0’:0, ‘proj’:’moll’, ‘sym’:’ro’, ‘symsize’:5, ‘edge’:’black’, ‘pltgrid’:1, ‘res’:’c’, ‘boundinglat’:0., ‘padlon’:0, ‘padlat’:0, ‘gridspace’:30, ‘details’:all False, ‘edge’:None, ‘cmap’:’jet’, ‘fancy’:0, ‘zone’:’’, ‘south’:False, ‘oceancolor’:’azure’, ‘landcolor’:’bisque’}
- pmagpy.pmagplotlib.plot_net(fignum)[source]#
draws circle and tick marks for equal area projection :param fignum: :type fignum: matplotlib figure number
- pmagpy.pmagplotlib.plot_np(fignum, indata, s, units)[source]#
makes plot of de(re)magnetization data for Thellier-Thellier type experiment
- Parameters:
fignum (matplotlib figure number)
indata (araiblock from, e.g., pmag.sortarai())
s (specimen name)
units ([K, J, ""] (kelvin, joules, unknown))
Effect
_______
makes a plot of the de (re) magnetization data
- pmagpy.pmagplotlib.plot_qq_exp(fignum, I, title, subplot=False)[source]#
plots data against an exponential distribution in 0=>90.
- Parameters:
fignum (matplotlib figure number)
I (data)
title (plot title)
subplot (boolean, if True plot as subplot with 1 row, two columns with fignum the plot number)
- pmagpy.pmagplotlib.plot_qq_norm(fignum, Y, title)[source]#
makes a Quantile-Quantile plot for data :param fignum: :type fignum: matplotlib figure number :param Y: :type Y: list or array of data :param title: :type title: title string for plot
- Returns:
d,dc – if d>dc, likely to be normally distributed (95% confidence)
- Return type:
the values for D and Dc (the critical value)
- pmagpy.pmagplotlib.plot_qq_unf(fignum, D, title, subplot=False, degrees=True)[source]#
plots data against a uniform distribution in 0=>360.
- Parameters:
fignum (matplotlib figure number)
D (data)
title (title for plot)
subplot (if True, make this number one of two subplots)
degrees (if True, assume that these are degrees)
- Returns:
Mu (Mu statistic (Fisher et al., 1987))
Mu_crit (critical value of Mu for a uniform distribution)
Effects
_______
makes a Quantile-Quantile plot of the data
- pmagpy.pmagplotlib.plot_s_bc(fignum, Bc, S, sym)[source]#
function to plot Squareness,Coercivity
- Parameters:
fignum (matplotlib figure number)
Bc (list or array coercivity values)
S (list or array of ratio of saturation remanence to saturation)
sym (matplotlib symbol (e.g., 'g^' for green triangles))
- pmagpy.pmagplotlib.plot_s_bcr(fignum, Bcr, S, sym)[source]#
function to plot Squareness,Coercivity of remanence
- Parameters:
fignum (matplotlib figure number)
Bcr (list or array coercivity of remenence values)
S (list or array of ratio of saturation remanence to saturation)
sym (matplotlib symbol (e.g., 'g^' for green triangles))
- pmagpy.pmagplotlib.plot_slnp(fignum, SiteRec, datablock, key)[source]#
plots lines and planes on a great circle with alpha 95 and mean deprecated (used in pmagplotlib)
- pmagpy.pmagplotlib.plot_square(fignum)[source]#
makes the figure square (equal axes) :param fignum: :type fignum: matplotlib figure number
- pmagpy.pmagplotlib.plot_strat(fignum, data, labels)[source]#
plots a time/depth series :param fignum: :type fignum: matplotlib figure number :param data: :type data: nested list of [X,Y] pairs :param labels: :type labels: [xlabel, ylabel, title]
- pmagpy.pmagplotlib.plot_teq(fignum, araiblock, s, pars)[source]#
plots directions of pTRM steps and zero field steps
- Parameters:
fignum (figure number for matplotlib object)
araiblock (nested list of data from pmag.sortarai())
s (specimen name)
pars (default is "",) – otherwise is dictionary with keys: ‘measurement_step_min’ and ‘measurement_step_max’
Effects
_______
makes the equal area projection with color coded symbols: red circles = ZI steps, blue squares = IZ steps, yellow = pTRM steps
- pmagpy.pmagplotlib.plot_ts(ax, agemin, agemax, step=1.0, timescale='gts20', ylabel='Age (Ma)')[source]#
This function makes a time scale plot between specified ages, using timescales as defined in pmag.get_ts(). The maximum possible age is ca. 83 Ma.
Parameters:
ax : figure object
agemin : (float) Minimum age for timescale in Ma
agemax : (float) Maximum age for timescale in Ma
step : (float) Y tick label spacing in Ma
timescale : (string) polarity time scale, default is gts20 (Gradstein et al. 2020); other options: ck95, gts04, gts12
ylabel : (string) if set, plot as ylabel
- Returns:
figure object
Example
Creates time scale plot from 0.5 to 5.5 Ma using the gts12 timescale:
>>> fig = plt.figure(figsize=(9,12))
>>> ax = fig.add_subplot(121)
>>> pmagplotlib.plot_ts(ax, 0.5, 5.5, timescale='gts12')
- pmagpy.pmagplotlib.plot_vs(fignum, Xs, c, ls)[source]#
plots vertical lines at Xs values
- Parameters:
fignum (matplotlib figure number)
Xs (list of X values for lines)
c (color for lines)
ls (linestyle for lines)
- pmagpy.pmagplotlib.plot_xbt(fignum, XB, T, e, b)[source]#
function to plot series of chi measurements as a function of temperature, holding field constant and varying frequency
- pmagpy.pmagplotlib.plot_xft(fignum, XF, T, e, b)[source]#
function to plot series of chi measurements as a function of temperature, holding field constant and varying frequency
- pmagpy.pmagplotlib.plot_xtb(fignum, XTB, Bs, e, f)[source]#
function to plot series of chi measurements as a function of temperature, holding frequency constant and varying B
- pmagpy.pmagplotlib.plot_xtf(fignum, XTF, Fs, e, b)[source]#
function to plot series of chi measurements as a function of temperature, holding field constant and varying frequency
- pmagpy.pmagplotlib.plot_zed(ZED, datablock, angle, s, units)[source]#
function to make equal area plot and zijderveld plot
- Parameters:
ZED (dictionary with keys for plots) – eqarea : figure number for equal area projection; zijd : figure number for Zijderveld plot; demag : figure number for magnetization against demag step
datablock (nested list of [step, dec, inc, M (Am2), quality]) – step : units assumed SI; M : units assumed Am2; quality : [g,b], good or bad measurement; if bad, will be marked as such
angle (angle for X axis in horizontal plane, if 0, x will be 0 declination)
s (specimen name)
units (SI units ['K','T','U'] for kelvin, tesla or undefined)
Effects
_______ – calls plotting functions for equal area, zijderveld and demag figures
- pmagpy.pmagplotlib.plot_zij(fignum, datablock, angle, s, norm=True)[source]#
function to make Zijderveld diagrams
- Parameters:
fignum (matplotlib figure number)
datablock (nested list of [step, dec, inc, M (Am2), type, quality]) – where type is a string, either ‘ZI’ or ‘IZ’ for IZZI experiments
angle (desired rotation in the horizontal plane (0 puts X on X axis))
s (specimen name)
norm (if True, normalize to initial magnetization = unity)
Effects
_______
makes a Zijderveld plot
- pmagpy.pmagplotlib.qsnorm(p)[source]#
rational approximation for x where q(x)=d, q being the cumulative normal distribution function. taken from Abramowitz & Stegun p. 933 |error(x)| < 4.5*10**-4
- pmagpy.pmagplotlib.save_plots(Figs, filenames, dir_path=None, **kwargs)[source]#
- Parameters:
Figs (dict) – dictionary of plots, e.g. {‘eqarea’: 1, …}
filenames (dict) – dictionary of filenames, e.g. {‘eqarea’: ‘mc01a_eqarea.svg’, …} dict keys should correspond with Figs
dir_path (str) – string of directory name where plots will be saved to
kwargs (other keyword arguments)
pmagpy.pmag#
- pmagpy.pmag.Dir_anis_corr(InDir, AniSpec)[source]#
Deprecated 9/14/2022
Takes the 6 element ‘s’ vector and the Dec,Inc ‘InDir’ data, performs simple anisotropy correction. returns corrected Dec, Inc
- pmagpy.pmag.EI(inc)[source]#
Given a mean inclination value of a distribution of directions, this function calculates the expected elongation of this distribution using a best-fit polynomial of the TK03 GAD secular variation model (Tauxe and Kent, 2004).
- Parameters:
inc (Integer or float) – Inclination in degrees.
- Returns:
elongation
- Return type:
float
Examples
>>> pmag.EI(20)
2.4863973732
>>> pmag.EI(90)
1.0241570135500004
- pmagpy.pmag.PintPars(datablock, araiblock, zijdblock, start, end, accept, **kwargs)[source]#
Calculate the paleointensity with magic parameters and make some definitions.
- pmagpy.pmag.Tmatrix(X)[source]#
Gets the orientation matrix (T) from data in X.
- Parameters:
X (nested lists of input data)
- Returns:
T
- Return type:
orientation matrix as a nested list
Examples
>>> X = [[1., 0.8, 5.], [0.5, 0.2, 2.], [1.4, 0.6, 0.1]]
>>> pmag.Tmatrix(X)
[[3.21, 1.74, 6.14], [1.74, 1.04, 4.46], [6.14, 4.46, 29.01]]
- pmagpy.pmag.Vdiff(D1, D2)[source]#
Calculates the vector difference between two directions D1, D2.
- Parameters:
D1 (Direction 1 as an array of [declination, inclination] pair or pairs)
D2 (Direction 2 as an array of [declination, inclination] pair or pairs)
- Returns:
The vector difference between D1 and D2
- Return type:
array
Examples
>>> pmag.Vdiff([350.0,10.0],[320.0,20.0])
array([ 60.00000000000001 , -18.61064009110688 , 0.527588019973717])
- pmagpy.pmag.a2s(a)[source]#
Convert 3x3 a matrix to 6 element “s” list (see Tauxe 1998).
- Parameters:
a (3x3 matrix as an array)
- Returns:
s
- Return type:
list of six elements based on a
Examples
>>> pmag.a2s([[1, 4, 6], [4, 2, 5], [6, 5, 3]])
array([1., 2., 3., 4., 5., 6.], dtype=float32)
- pmagpy.pmag.add_flag(var, flag)[source]#
For use when calling command-line scripts from within a program. If a variable is present, add its proper command-line flag. Returns a string.
- pmagpy.pmag.adjust_all_to_360(dictionary)[source]#
Take a dictionary and check each key/value pair. If this key is of type: declination/longitude/azimuth/direction, adjust it to be within 0-360 as required by the MagIC data model
- pmagpy.pmag.adjust_to_360(val, key)[source]#
Take in a value and a key. If the key is of the type: declination/longitude/azimuth/direction, adjust it to be within the range 0-360 as required by the MagIC data model
- pmagpy.pmag.adjust_val_to_360(val)[source]#
Take in a single numeric (or null) argument. Return argument adjusted to be between 0 and 360 degrees.
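Example
A minimal sketch (not from the original documentation); the calls below map an out-of-range declination or longitude into 0-360 (e.g. -15 is expected to become 345).
>>> from pmagpy import pmag
>>> pmag.adjust_val_to_360(-15)
>>> pmag.adjust_val_to_360(370.5)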
- pmagpy.pmag.age_to_BP(age, age_unit)[source]#
Convert an age value into the equivalent in time Before Present (BP), where Present is 1950.
- Parameters:
age (age as a float)
age_unit (age unit as a str, valid strings:) – (Years AD (+/-), Years Cal AD (+/-), Years BP, ka, Ma, or Ga)
- Returns:
ageBP
- Return type:
age before present
- pmagpy.pmag.angle(D1, D2)[source]#
Calculate the angle between two directions.
- Parameters:
D1 (Direction 1 as an array of [declination, inclination] pair or pairs)
D2 (Direction 2 as an array of [declination, inclination] pair or pairs)
- Returns:
angle – angle between the input directions
- Return type:
single-element array
Examples
>>> pmag.angle([350.0,10.0],[320.0,20.0])
array([ 30.59060998])
>>> pmag.angle([[350.0,10.0],[320.0,20.0]],[[345,13],[340,14]])
array([ 5.744522410794302, 20.026413431433475])
- pmagpy.pmag.apseudo(Ss, ipar, sigma)[source]#
Deprecated: 9/14/2022
Draw a bootstrap sample of Ss.
- Parameters:
Ss (six element tensor as a list)
ipar (boolean (True, False, or zero value))
sigma (sigma of Ss)
- Returns:
BSs – bootstrap sample of Ss
- Return type:
array
Examples
>>> pmag.apseudo(np.array([2,2,1,6,1,1]),0,0)
array([1, 2, 1, 2, 2, 1])
- pmagpy.pmag.apwp(data, print_results=False)[source]#
Calculates expected pole positions and directions for given plate, location and age.
- Parameters:
data ([plate,lat,lon,age]) –
- plate[NA, SA, AF, IN, EU, AU, ANT, GL]
NA : North America
SA : South America
AF : Africa
IN : India
EU : Eurasia
AU : Australia
ANT : Antarctica
GL : Greenland
lat/lon : latitude/longitude in degrees N/E
age : age in millions of years
print_results (if True will print out nicely formatted results)
- Return type:
if print_results is False, [Age,Paleolat, Dec, Inc, Pole_lat, Pole_lon]
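Example
A minimal sketch (not from the original documentation) for a hypothetical North American site at 33° N, 250° E with an age of 30 Ma; with print_results=False, the list [Age, Paleolat, Dec, Inc, Pole_lat, Pole_lon] is returned.
>>> from pmagpy import pmag
>>> pmag.apwp(['NA', 33.0, 250.0, 30.0], print_results=False)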
- pmagpy.pmag.b_vdm(B, lat)[source]#
Converts a magnetic field value of list of values to virtual dipole moment (VDM) or a virtual axial dipole moment (VADM).
- Parameters:
B (local magnetic field strength in tesla, as a value or list of values)
lat (latitude of site in degrees)
- Returns:
VDM or V(A)DM in units of Am^2
Examples
>>> pmag.b_vdm(33e-6,22)*1e-21
71.58815974511788
- pmagpy.pmag.bc02(data)[source]#
Get APWP from Besse and Courtillot 2002 paper
- Parameters:
data (list of [plate, site_lat, site_lon, age]) –
plate : string (options: AF, ANT, AU, EU, GL, IN, NA, SA)
site_lat : float
site_lon : float
age : float in Myr
- Returns:
pole_lat (pole latitude)
pole_lon (pole longitude)
- pmagpy.pmag.binglookup(w1i, w2i)[source]#
Bingham statistics lookup table.
- Parameters:
w1i (initial values for w1 and w2)
w2i (initial values for w1 and w2)
- Returns:
k1,k2
- Return type:
k1 and k2 for Bingham distribution
Examples
>>> pmag.binglookup(0.12,0.15) (-4.868, -3.7289999999999996)
- pmagpy.pmag.calculate_best_fit_vectors(L, E, V, n_planes)[source]#
Calculates the best fit vectors for a set of plane interpretations used in fisher mean calculations.
- Parameters:
L (a list of the "EL, EM, EN" array of MM88 or the Cartesian form of dec and inc of the plane interpretation)
E (the sum of the Cartesian coordinates of all the line fits to be used in the mean)
V (initial direction to start iterating from to get plane best fits)
n_planes (number of planes)
- Returns:
XV
- Return type:
nested list of n_planes by 3, where the 3 columns are the Cartesian components of the best fit vector
- pmagpy.pmag.calculate_k(R, N)[source]#
Calculates the Fisher concentration parameter (k) based on the number of vectors and the resultant vector length. This calculation occurs within the fisher_mean() function. Use of this function can be helpful when R and N are available, but the vectors themselves are not.
- Parameters:
R (Resultant vector length)
N (Number of vectors)
- Returns:
k
- Return type:
Fisher concentration parameter
Examples
>>> n,r = 3, 4.335 >>> pmag.calculate_k(r,n) -1.4981273408
- pmagpy.pmag.calculate_r(alpha95, N)[source]#
Calculates the resultant vector length (R) based on the number of vectors and provided Fisher alpha95. Doing so can be useful for conducting statistical tests that require R when it is not provided.
- Parameters:
alpha95 (Fisher alpha_95 value)
N (number of vectors)
- Returns:
R
- Return type:
resultant vector length
Examples
>>> alpha95, N = 6.41, 3 >>> pmag.calculate_r(alpha95,N) 2.994608233588127
- pmagpy.pmag.cart2dir(cart)[source]#
Converts a direction in Cartesian coordinates into declination and inclination.
- Parameters:
cart (list of [x,y,z] or list of lists [[x1,y1,z1],[x2,y2,z2]...])
- Returns:
direction_array
- Return type:
array of [declination, inclination, intensity]
Examples
>>> pmag.cart2dir([0,1,0]) array([ 90., 0., 1.])
- pmagpy.pmag.chart_maker(Int, Top, start=100, outfile='chart.txt')[source]#
Makes a chart for performing IZZI experiments. Print out the file and tape it to the oven; the chart helps keep track of the different steps.
Z : performed in zero field - enter the temperature XXX.0 in the SIO formatted measurement file created by the LabView program
I : performed in the lab field written at the top of the form
P : a pTRM step - performed at the temperature and in the lab field
- Parameters:
Int (list of intervals [e.g., 50,10,5])
Top (list of upper bounds for each interval [e.g., 500, 550, 600])
start (first temperature step, default is 100)
outfile (name of output file, default is 'chart.txt')
- Returns:
Creates a file (outfile) with columns for:
file : write down the name of the measurement file
field : write down the lab field for the infield steps (in uT)
the type of step (Z: zero field, I: in field, P: pTRM step), the temperature of the step, and the code for SIO-like treatment steps: XXX.0 [zero field], XXX.1 [in field], XXX.2 [pTRM check] - done in a lab field
date : date the step was performed
run # : an optional run number
zones I-III : field in the zones in the oven
start : time the run was started
sp : time the setpoint was reached
cool : time cooling started
- pmagpy.pmag.circ(dec, dip, alpha, npts=201)[source]#
Calculates points on a circle about dec and dip with angle alpha.
- Parameters:
dec (float) – declination of vector
dip (float) – dip of vector
alpha (float) – angle of small circle - 90 if vector is pole to great circle
npts (int) – number of points on the circle, default 201
- Returns:
D_out, I_out – declinations and inclinations along small (great) circle about dec, dip
- Return type:
list
Examples
>>> pmag.circ(50,10,10,5) ([60.15108171104812, 50.0, 39.848918288951864, 49.99999999999999, 60.15108171104812], [9.846551939834077, 0.0, 9.846551939834077, 20.0, 9.846551939834079])
- pmagpy.pmag.cleanup(first_I, first_Z)[source]#
Cleans up unbalanced steps. Failure can come from an unbalanced final step or from missing steps; this takes care of missing steps.
- pmagpy.pmag.convert_ages(Recs, data_model=3)[source]#
Converts ages in a list of dictionaries to units of Millions of years ago, Ma.
- Parameters:
Recs (list of dictionaries in the data model given by data_model)
data_model (MagIC data model, default is 3)
- Returns:
New (list of dictionaries with the converted ages)
Examples
>>> sites = pd.read_csv('data_files/convert_ages/sites.txt', sep='\t', header=1)  # create a dataframe from the example file
>>> sites_age = sites.dropna(subset=['age'])  # drop all rows that have nan in the age column since the function does not work with nans
>>> sites_dict = sites_age.to_dict('records')  # 'records' returns list-like values within the dict
>>> sites_ages_converted = pmag.convert_ages(sites_dict)
>>> sites_ages_converted_df = pd.DataFrame.from_dict(sites_ages_converted)  # convert the age-converted list of dictionaries back to a dataframe
>>> print('ORIGINAL FILE:', sites_age['age'].head())
ORIGINAL FILE: 1 100.0 3 625.0 5 625.0 7 750.0 9 800.0 Name: age, dtype: float64
>>> print('CONVERTED AGES FILE:', sites_ages_converted_df['age'].head())
CONVERTED AGES FILE: 0 1.9110e-03 1 1.3860e-03 2 1.3860e-03 3 1.2610e-03 4 1.2110e-03 Name: age, dtype: object
- pmagpy.pmag.convert_and_combine_2_to_3(dtype, map_dict, input_dir='.', output_dir='.', data_model=None)[source]#
Read in er_*.txt file and pmag_*.txt file in working directory. Combine the data, then translate headers from 2.5 –> 3.0. Last, write out the data in 3.0.
- Parameters:
dtype (string for input type (specimens, samples, sites, etc.))
map_dict (dictionary with format {header2_format: header3_format, ...} (from mapping.map_magic module))
input_dir (input directory, default ".")
output_dir (output directory, default ".")
data_model (data_model3.DataModel object, default None)
- Return type:
output_file_name with 3.0 format data (or None if translation failed)
- pmagpy.pmag.convert_criteria_file_2_to_3(fname='pmag_criteria.txt', input_dir='.', output_dir='.', data_model=None)[source]#
Convert a criteria file from 2.5 to 3.0 format and write it out to file
- Parameters:
fname (string of filename (default "pmag_criteria.txt"))
input_dir (string of input directory (default "."))
output_dir (string of output directory (default "."))
data_model (data_model.DataModel object (default None))
- Returns:
outfile (string output criteria filename, or False)
crit_container (cb.MagicDataFrame with 3.0 criteria table)
- pmagpy.pmag.convert_directory_2_to_3(meas_fname='magic_measurements.txt', input_dir='.', output_dir='.', meas_only=False, data_model=None)[source]#
Convert 2.0 measurements file into 3.0 measurements file. Merge and convert specimen, sample, site, and location data. Also translates criteria data.
- Parameters:
meas_fname (name of measurement file (do not include full path), default is "magic_measurements.txt")
input_dir (name of input directory (default is "."))
output_dir (name of output directory (default is "."))
meas_only (boolean, convert only measurement data (default is False))
data_model (data_model3.DataModel object (default is None))
- Returns:
NewMeas (3.0 measurements data (output of pmag.convert_items))
upgraded (list of files successfully upgraded to 3.0)
no_upgrade (list of 2.5 files not upgraded to 3.0)
- pmagpy.pmag.convert_items(data, mapping)[source]#
This function maps a given set of dictionaries onto the given mapping and outputs an updated list of dictionaries.
- Parameters:
data (list of dicts (each dict a record for one item))
mapping (mapping with column names to swap into the records)
- Returns:
new_recs
- Return type:
updated list of dicts
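Examples
A minimal sketch, assuming a hand-built mapping from 2.5 column names to their 3.0 equivalents (the header pairs shown are illustrative only; the mapping.map_magic module provides the full dictionaries):
>>> from pmagpy import pmag
>>> recs = [{'er_specimen_name': 'abc01a', 'specimen_dec': '10.3'}]
>>> mapping = {'er_specimen_name': 'specimen', 'specimen_dec': 'dir_dec'}
>>> pmag.convert_items(recs, mapping)   # returns the records with keys renamed to the 3.0 headers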
- pmagpy.pmag.convert_lat(Recs)[source]#
Uses lat for ages < 5 Ma; uses model_lat if present, otherwise tries to use average_inc to estimate plat.
- Parameters:
Recs (list of dictionaries in data model by data_model) – This list of dictionaries must only include data with ages less than 5 Ma
- Returns:
New
- Return type:
list of dictionaries with plat estimate
- pmagpy.pmag.cross(v, w)[source]#
Cross product of two vectors.
- Parameters:
v (3 value vector list)
w (3 value vector list)
- Returns:
[x, y, z]
- Return type:
cross product resultant vector
Examples
>>> pmag.cross([3,6,0],[1,5,1]) [6, -3, 9]
- pmagpy.pmag.design(npos)[source]#
Make a design matrix for an anisotropy experiment.
- Parameters:
npos (number of measurement positions.) – either 15 or 6
- Returns:
A (design matrix array for the given number of positions)
B (susceptibilities array)
Examples
>>> pmag.design(10) measurement protocol not supported yet
>>> pmag.design(15) (array([[ 0.5, 0.5, 0. , -1. , 0. , 0. ], [ 0.5, 0.5, 0. , 1. , 0. , 0. ], [ 1. , 0. , 0. , 0. , 0. , 0. ], [ 0.5, 0.5, 0. , -1. , 0. , 0. ], [ 0.5, 0.5, 0. , 1. , 0. , 0. ], [ 0. , 0.5, 0.5, 0. , -1. , 0. ], [ 0. , 0.5, 0.5, 0. , 1. , 0. ], [ 0. , 1. , 0. , 0. , 0. , 0. ], [ 0. , 0.5, 0.5, 0. , -1. , 0. ], [ 0. , 0.5, 0.5, 0. , 1. , 0. ], [ 0.5, 0. , 0.5, 0. , 0. , -1. ], [ 0.5, 0. , 0.5, 0. , 0. , 1. ], [ 0. , 0. , 1. , 0. , 0. , 0. ], [ 0.5, 0. , 0.5, 0. , 0. , -1. ], [ 0.5, 0. , 0.5, 0. , 0. , 1. ]]), array([[ 0.15, 0.15, 0.4 , 0.15, 0.15, -0.1 , -0.1 , -0.1 , -0.1 , -0.1 , 0.15, 0.15, -0.1 , 0.15, 0.15], [ 0.15, 0.15, -0.1 , 0.15, 0.15, 0.15, 0.15, 0.4 , 0.15, 0.15, -0.1 , -0.1 , -0.1 , -0.1 , -0.1 ], [-0.1 , -0.1 , -0.1 , -0.1 , -0.1 , 0.15, 0.15, -0.1 , 0.15, 0.15, 0.15, 0.15, 0.4 , 0.15, 0.15], [-0.25, 0.25, 0. , -0.25, 0.25, 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. ], [ 0. , 0. , 0. , 0. , 0. , -0.25, 0.25, 0. , -0.25, 0.25, 0. , 0. , 0. , 0. , 0. ], [ 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , -0.25, 0.25, 0. , -0.25, 0.25]]))
- pmagpy.pmag.designAARM(npos)[source]#
Calculates B matrix for AARM calculations.
- Parameters:
npos (number of positions) – 9 is the only valid number of positions.
- Returns:
B (B matrix as an array)
H (Field directions)
tmpH (tmpH matrix)
- pmagpy.pmag.designATRM(npos)[source]#
Calculates B matrix for ATRM calculations.
- Parameters:
npos (number of positions) – 6 or more positions are valid.
- Returns:
B (B matrix as an array)
H (Field directions)
tmpH (tmpH matrix)
- pmagpy.pmag.di_boot(DIs, nb=5000)[source]#
Returns bootstrap means for Directional data.
- Parameters:
DIs (nested list of Dec,Inc pairs)
nb (number of bootstrap pseudosamples, default is 5000)
- Returns:
BDIs
- Return type:
nested list of bootstrapped mean Dec,Inc pairs
Examples
>>> di_block = ([[-45,150], [-40,150], [-38,145]]) >>> pmag.di_boot(di_block,5) [[136.66619627955163, 30.021001931432338], [138.33380372044837, 30.02100193143235], [140.64213759144877, 31.669596401508702], [136.66619627955163, 30.021001931432338], [139.58053739971953, 33.378250658618654]]
- pmagpy.pmag.dia_vgp(*args)[source]#
Converts directional data (declination, inclination, alpha95) at a given location (Site latitude, Site longitude) to pole position (pole longitude, pole latitude, dp, dm).
- Parameters:
*args (dec, inc, a95, site_lat, site_lon) – either individual values for one direction and site, or a list of lists of such values
- Returns:
if input is individual values for one pole the return is
pole longitude, pole latitude, dp, dm
if input is list of lists the return is
list of pole longitudes, list of pole latitudes, list of dp, list of dm
Examples
>>> pmag.dia_vgp(4, 41, 0, 33, -117) (41.68629415047637, 79.86259998889103, 0.0, 0.0)
- pmagpy.pmag.dimap(D, I)[source]#
Function to map directions to x,y pairs in equal area projection.
- Parameters:
D (list or array of declinations (as float))
I (list or array of inclinations (as float))
- Returns:
XY
- Return type:
x, y values of directions for equal area projection [x,y]
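Examples
A short sketch mapping a single direction to equal area x, y (output omitted; it depends on the projection arithmetic):
>>> from pmagpy import pmag
>>> xy = pmag.dimap(350.0, 60.0)   # [x, y] suitable for plotting on an equal area net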
- pmagpy.pmag.dimap_V(D, I)[source]#
Maps declinations and inclinations into equal area projections.
- Parameters:
D (numpy arrays)
I (numpy arrays)
- Returns:
XY
- Return type:
array of equal area projections
Examples
>>> pmag.dimap_V([35,60,20],[70,80,-10]) array([[0.140856382055789, 0.20116376126988 ], [0.106743548942519, 0.061628416716219], [0.310909633795401, 0.85421719834377 ]])
- pmagpy.pmag.dir2cart(d)[source]#
Converts a list or array of vector directions in degrees (declination, inclination) to an array of the direction in cartesian coordinates (x,y,z).
- Parameters:
d (list or array of [dec,inc] or [dec,inc,intensity])
- Returns:
cart
- Return type:
array of [x,y,z]
Examples
>>> pmag.dir2cart([200,40,1]) array([-0.71984631, -0.26200263, 0.64278761])
>>> pmag.dir2cart([200,40]) array([[-0.719846310392954, -0.262002630229385, 0.642787609686539]])
>>> data = np.array([ [16.0, 43.0, 21620.33], [30.5, 53.6, 12922.58], [6.9, 33.2, 15780.08], [352.5, 40.2, 33947.52], [354.2, 45.1, 19725.45]]) >>> pmag.dir2cart(data) array([[15199.574113612794 , 4358.407742577491 , 14745.029604010038 ], [ 6607.405832448041 , 3892.0594770716 , 10401.304487835589 ], [13108.574245285025 , 1586.3117853121191, 8640.591471770322 ], [25707.154931463603 , -3384.411152593326 , 21911.687763162565 ], [13852.355235322588 , -1407.0709331498472, 13972.322052043308 ]])
- pmagpy.pmag.dir_df_boot(dir_df, nb=5000, par=False)[source]#
Performs a bootstrap for direction DataFrame with optional parametric bootstrap
- Parameters:
dir_df (Pandas DataFrame with columns:) –
dir_dec : mean declination
dir_inc : mean inclination
Required for parametric bootstrap:
dir_n : number of data points in mean
dir_k : Fisher k statistic for mean
nb (number of bootstraps, default is 5000)
par (if True, do a parameteric bootstrap)
- Returns:
BDIs
- Return type:
nested list of bootstrapped mean Dec,Inc pairs
Examples
>>> dir_df = pd.DataFrame() >>> dir_df['dir_inc'] = [30,75,-113,-127,104] >>> dir_df['dir_dec'] = [50,100,78,48,87] >>> pmag.dir_df_boot(dir_df,nb=4) [[249.0092732274716, 47.78774025782868], [52.15562660104691, 14.523345688004293], [214.5976992675414, 49.79280429500907], [119.6384153360684, 86.17066958304461]]
>>> dir_df['dir_n'] = [4,15,2,36,55] >>> dir_df['dir_k'] = [1.2,3.0,0.4,0.4,0.8] >>> pmag.dir_df_boot(dir_df,3,par=True) [[43.54318517848151, 53.40671924110994], [278.5836582875345, 25.159165079114043], [276.59474232833645, -22.88695795286902]]
- pmagpy.pmag.dir_df_fisher_mean(dir_df)[source]#
Calculates fisher mean for Pandas dataframe.
- Parameters:
dir_df (pandas data frame with columns:) – dir_dec : declination dir_inc : inclination
- Returns:
fpars – dec : mean declination inc : mean inclination r : resultant vector length n : number of data points k : Fisher k value csd : Fisher circular standard deviation alpha95 : Fisher circle of 95% confidence
- Return type:
dictionary containing the Fisher mean and statistics
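Examples
A minimal sketch building the two required DataFrame columns (the directions are invented for illustration):
>>> import pandas as pd
>>> from pmagpy import pmag
>>> dir_df = pd.DataFrame({'dir_dec': [150, 151, 145, 146], 'dir_inc': [-45, -46, -38, -41]})
>>> fpars = pmag.dir_df_fisher_mean(dir_df)   # dictionary with dec, inc, r, n, k, csd, alpha95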
- pmagpy.pmag.dms2dd(d)[source]#
Converts a list or array of degree, minute, second locations to an array of decimal degrees.
- Parameters:
d (list or array of [deg, min, sec])
- Returns:
dd
- Return type:
array of decimal degrees corresponding to d
Examples
>>> pmag.dms2dd([60,35,15]) 60 35 15 array(60.587500000000006)
>>> data = np.array([ [16.0, 43.0, 33], [30.5, 53.6, 58], [6.9, 33.2, 8], [352.5, 40.2, 52], [354.2, 45.1, 45]]) >>> pmag.dms2dd(data) [ 16. 30.5 6.9 352.5 354.2] [43. 53.6 33.2 40.2 45.1] [33. 58. 8. 52. 45.] array([ 16.72583333333333, 31.409444444444446, 7.455555555555557, 353.18444444444447, 354.96416666666664 ])
- pmagpy.pmag.do_mag_map(date, lon_0=0, alt=0, file='', mod='cals10k', resolution='low')[source]#
Returns lists of declination, inclination and intensities for lat/lon grid for desired model and date.
- Parameters:
date (required date in decimal years, Common Era; negative for Before Common Era)
Parameters (Optional)
-------------------
mod (model to use: 'arch3k', 'cals3k', 'pfm9k', 'hfm10k', 'cals10k.2', 'shadif14k', 'cals10k.1b', or 'custom')
file (l m g h formatted file for a custom model)
lon_0 (central longitude for Hammer projection)
alt (altitude)
resolution (['low', 'high'], default is 'low')
- Returns:
Bdec=list of declinations
Binc=list of inclinations
B = list of total field intensities in nT
Br = list of radial field intensities
lons = list of longitudes evaluated
lats = list of latitudes evaluated
- pmagpy.pmag.doaniscorr(PmagSpecRec, AniSpec)[source]#
Deprecated 9/14/2022
Takes the 6 element 's' vector and the Dec, Inc, Int 'Dir' data and performs a simple anisotropy correction. Returns corrected Dec, Inc, Int.
- pmagpy.pmag.dobingham(di_block)[source]#
Calculates the Bingham mean and associated statistical parameters from directions that are input as a di_block.
- Parameters:
di_block (a nested list of [dec,inc] or [dec,inc,intensity])
- Returns:
bpars (dictionary containing the Bingham mean and associated statistics)
dictionary keys – dec : mean declination inc : mean inclination n : number of datapoints Eta : major ellipse Edec : declination of major ellipse axis Einc : inclination of major ellipse axis Zeta : minor ellipse Zdec : declination of minor ellipse axis Zinc : inclination of minor ellipse axis
- pmagpy.pmag.docustom(lon, lat, alt, gh)[source]#
Passes the coefficients to the Malin and Barraclough routine (function pmag.magsyn) to calculate the field from the coefficients.
- Parameters:
lon (east longitude in degrees (0 to 360 or -180 to 180))
lat (latitude in degrees (-90 to 90))
alt (height above mean sea level in km (itype = 1 assumed))
gh (list of gauss coefficients)
- Returns:
x (north component of the magnetic field in nT)
y (east component of the magnetic field in nT)
z (downward component of the magnetic field in nT)
f (total magnetic field in nT)
Examples
>>> gh = pmag.doigrf(30,70,10,2022,coeffs=True) >>> pmag.docustom(30,70,10,gh) (10033.695088989529, 2822.610862622648, 53170.834174096184, 54182.8365443324)
- pmagpy.pmag.dodirot(D, I, Dbar, Ibar)[source]#
Rotate a direction (declination, inclination) by the difference between dec = 0 and inc = 90 and the provided desired mean direction.
- Parameters:
D (declination to be rotated)
I (inclination to be rotated)
Dbar (declination of desired mean)
Ibar (inclination of desired mean)
- Returns:
drot, irot
- Return type:
rotated declination and inclination
Examples
>>> pmag.dodirot(0,90,5,85) (5.0, 85.0)
- pmagpy.pmag.dodirot_V(di_array, Dbar, Ibar)[source]#
Rotate an array of declination, inclination pairs by the difference between dec = 0 and inc = 90 and the provided desired mean direction
- Parameters:
di_array (numpy array of [[Dec1,Inc1],[Dec2,Inc2],....])
Dbar (declination of desired mean)
Ibar (inclination of desired mean)
- Returns:
Rotated decs and incs: [[rot_Dec1,rot_Inc1],[rot_Dec2,rot_Inc2],….]
- Return type:
array
Examples
>>> di_array = np.array([[0,90],[0,90],[0,90]]) >>> pmag.dodirot_V(di_array,5,15) array([[ 5. , 15.000000000000002], [ 5. , 15.000000000000002], [ 5. , 15.000000000000002]])
- pmagpy.pmag.doeigs_s(tau, Vdirs)[source]#
Gets elements of s from eigenvalues - note that this is very unstable.
- Parameters:
tau (3 element array) – list of eigenvalues in decreasing order: [t1,t2,t3]
Vdirs (list of the eigenvector directions) – [[V1_dec,V1_inc],[V2_dec,V2_inc],[V3_dec,V3_inc]]
- Returns:
s = [x11,x22,x33,x12,x23,x13]
- Return type:
The six tensor elements as a list
Examples
>>> pmag.doeigs_s([2.2, -0.33, -0.68], [[44.59, 40.45], [295.45, 21.04], [185.08, 42.13]]) array([0.22194667, 0.3905577 , 0.57749563, 0.7154779 , 0.8923144 , 1.0629525 ], dtype=float32)
- pmagpy.pmag.doeqdi(x, y, UP=False)[source]#
Takes digitized x, y data and returns the dec, inc, assuming an equal area projection.
- Parameters:
x (array of digitized x from point on equal area projection)
y (array of digitized y from point on equal area projection)
UP (if True, is an upper hemisphere projection)
- Returns:
dec (declination)
inc (inclination)
- pmagpy.pmag.doflip(dec, inc)[source]#
Flips upper hemisphere data to lower hemisphere.
- Parameters:
dec (float) – declination
inc (float) – inclination
- Returns:
containing the flipped declination and inclination
- Return type:
tuple
Examples
>>> pmag.doflip(30,-45) (210.0, 45)
- pmagpy.pmag.dogeo(dec, inc, az, pl)[source]#
Rotates declination and inclination into geographic coordinates using the azimuth and plunge of the X direction (lab arrow) of a specimen.
- Parameters:
dec (declination in specimen coordinates)
inc (inclination in specimen coordinates)
az (azimuth of the specimen X direction (lab arrow))
pl (plunge of the specimen X direction (lab arrow))
- Returns:
rotated_direction
- Return type:
tuple of declination, inclination in geographic coordinates
Examples
>>> pmag.dogeo(0.0,90.0,0.0,45.5) (180.0, 44.5)
- pmagpy.pmag.dogeo_V(indat)[source]#
Rotates declination and inclination into geographic coordinates using the azimuth and plunge of the X direction (lab arrow) of a specimen.
- Parameters:
indat (array of lists) – data format: [dec, inc, az, pl]
- Returns:
an array of declinations an array of inclinations
- Return type:
two arrays
Examples
>>> pmag.dogeo_V(np.array([[0.0,90.0,0.0,45.5],[0.0,90.0,0.0,45.5]])) (array([180., 180.]), array([44.5, 44.5]))
- pmagpy.pmag.dohext(nf, sigma, s)[source]#
Calculates hext parameters for nf, sigma and s.
- Parameters:
nf (number of degrees of freedom (measurements - 6))
sigma (the sigma of the measurements)
s ([x11,x22,x33,x12,x23,x13] - the six tensor elements)
- Returns:
hpars (dictionary of Hext statistics with keys:) – ‘F_crit’ : critical value for anisotropy ‘F12_crit’ : critical value for tau1>tau2, tau2>3 ‘F’ : value of F ‘F12’ : value of F12 ‘F23’ : value of F23 ‘v1_dec’: declination of principal eigenvector ‘v1_inc’: inclination of principal eigenvector ‘v2_dec’: declination of major eigenvector ‘v2_inc’: inclination of major eigenvector ‘v3_dec’: declination of minor eigenvector ‘v3_inc’: inclination of minor eigenvector ‘t1’: principal eigenvalue ‘t2’: major eigenvalue ‘t3’: minor eigenvalue ‘e12’: angle of confidence ellipse of principal eigenvector in direction of major eigenvector ‘e23’: angle of confidence ellipse of major eigenvector in direction of minor eigenvector ‘e13’: angle of confidence ellipse of principal eigenvector in direction of minor eigenvector
If working with a data set with no sigmas and the average is desired, use nf, sigma, avs = pmag.sbar(Ss) as input.
Examples
>>> pmag.dohext(30, 0.00027464, [0.33586472,0.32757074,0.33656454,0.0056526,0.00449771,-0.00036542]) {'F_crit': '2.5335', 'F12_crit': '3.3158', 'F': 820.3194287677485, 'F12': 74.97208429827333, 'F23': 1167.2979118918333, 'v1_dec': 38.360480228001826, 'v1_inc': 36.10621428141474, 'v2_dec': 183.62757676112915, 'v2_inc': 48.41031537341878, 'v3_dec': 294.8243200339332, 'v3_inc': 17.791534673908338, 't1': 0.33999866, 't2': 0.33663565, 't3': 0.3233657, 'e12': 6.002663418693858, 'e23': 1.5264872237415046, 'e13': 1.2179522275647792}
- pmagpy.pmag.doigrf(lon, lat, alt, date, **kwargs)[source]#
Calculates the interpolated (<=2020) or extrapolated (>2020) main field and secular variation coefficients and passes them to the Malin and Barraclough routine (function pmag.magsyn) to calculate the field from the coefficients.
- Parameters:
lon (east longitude in degrees (0 to 360 or -180 to 180))
lat (latitude in degrees (-90 to 90))
alt (height above mean sea level in km (itype = 1 assumed))
date (Required date in years and decimals of a year (A.D.))
Parameters (Optional)
-------------------
coeffs (if True, then return the gh coefficients)
mod (model to use ('arch3k','cals3k','pfm9k','hfm10k','cals10k.2','cals10k.1b','shadif14k','shawq2k','shawqIA')) –
arch3k (Korte et al., 2009) cals3k (Korte and Constable, 2011) cals10k.1b (Korte et al., 2011) pfm9k (Nilsson et al., 2014) hfm.OL1.A1 (Constable et al., 2016) cals10k.2 (Constable et al., 2016) shadif14k (Pavon-Carrasco et al., 2014) shawq2k (Campuzano et al., 2019) shawqIA (Osete et al., 2020) ggf100k (Panofska et al., 2018) [in 200 year increments from -99950 to 1850 only]
- N.B. the first four of these models are constrained to agree
with gufm1 (Jackson et al., 2000) for the past four centuries
- Returns:
x (north component of the magnetic field in nT)
y (east component of the magnetic field in nT)
z (downward component of the magnetic field in nT)
f (total magnetic field in nT)
gh (list of gauss coefficients) – only if coeffs=True
By default, igrf13 coefficients are used between 1900 and 2020
from http://www.ngdc.noaa.gov/IAGA/vmod/igrf.html.
To check the results you can run the interactive program at the NGDC www.ngdc.noaa.gov/geomag-web
Examples
>>> pmag.doigrf(30,70,10,2022) (10030.985358058582, 2797.0490284010084, 53258.99275624336, 54267.52675339505)
>>> pmag.doigrf(30,70,10,2022,coeffs=True) array([-2.94048e+04, -1.45090e+03, 4.65250e+03, -2.49960e+03, 2.98200e+03, -2.99160e+03, 1.67700e+03, -7.34600e+02, 1.36320e+03, -2.38120e+03, -8.21000e+01, 1.23620e+03, 2.41900e+02, 5.25700e+02, -5.43400e+02, 9.03000e+02, 8.09500e+02, 2.81900e+02, 8.63000e+01, -1.58400e+02, -3.09400e+02, 1.99700e+02, 4.80000e+01, -3.49700e+02, ...
- pmagpy.pmag.doincfish(inc)[source]#
Calculates Fisher mean inclination from inclination-only data. This function uses the method of McFadden and Reid (1982), and incorporates asymmetric confidence limits after McElhinny and McFadden (2000).
- Parameters:
inc (list of inclination values)
- Returns:
‘n’ : number of inclination values supplied ‘ginc’ : gaussian mean of inclinations ‘inc’ : estimated Fisher mean ‘r’ : estimated Fisher R value ‘k’ : estimated Fisher kappa ‘alpha95’: estimated confidence limit ‘upper_confidence_limit’ : estimated upper confidence limit of inclination ‘lower_confidence_limit’ : estimated lower confidence limit of inclination ‘csd’ : estimated circular standard deviation
- Return type:
dict
Examples
>>> pmag.doincfish([62.4, 61.6, 50.2, 65.2, 53.2, 61.4, 74.0, 60.0, 52.6, 71.8]) {'n': 10, 'ginc': 61.239999999999995, 'inc': 62.18, 'r': 9.828974184785405, 'k': 52.623634558953846, 'upper_confidence_limit': 66.49823541535572, 'lower_confidence_limit': 55.9733682324565, 'alpha95': 5.2624335914496125, 'csd': 11.165922232016465}
- pmagpy.pmag.dok15_s(k15)[source]#
Calculates least-squares matrix for 15 measurements from Jelinek [1976].
- Parameters:
k15 (k15 value)
- Returns:
sbar (array of six 15 element tensors)
sigma (array of sigma, standard deviation, of the measurement)
bulk (array of bulk susceptibility)
Examples
>>> pmag.dok15_s(0.5) (array([[ 0.75, 0.75, 2. , 0.75, 0.75, -0.5 , -0.5 , -0.5 , -0.5 , -0.5 , 0.75, 0.75, -0.5 , 0.75, 0.75], [ 0.75, 0.75, -0.5 , 0.75, 0.75, 0.75, 0.75, 2. , 0.75, 0.75, -0.5 , -0.5 , -0.5 , -0.5 , -0.5 ], [-0.5 , -0.5 , -0.5 , -0.5 , -0.5 , 0.75, 0.75, -0.5 , 0.75, 0.75, 0.75, 0.75, 2. , 0.75, 0.75], [-1.25, 1.25, 0. , -1.25, 1.25, 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. ], [ 0. , 0. , 0. , 0. , 0. , -1.25, 1.25, 0. , -1.25, 1.25, 0. , 0. , 0. , 0. , 0. ], [ 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , -1.25, 1.25, 0. , -1.25, 1.25]]), array([6.101001739241042, 6.101001739241042, 6.101001739241042, 6.101001739241042, 6.101001739241042, 6.101001739241042, 6.101001739241042, 6.101001739241042, 6.101001739241042, 6.101001739241042, 6.10100173924104 , 6.10100173924104 , 6.101001739241042, 6.10100173924104 , 6.10100173924104 ]), array([0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333, 0.033333333333333]))
- pmagpy.pmag.dokent(data, NN, distribution_95=False)[source]#
Gets Kent parameters for data.
- Parameters:
data (nested pairs of [Dec,Inc])
NN (normalization) – Number of data for Kent ellipse NN is 1 for Kent ellipses of bootstrapped mean directions
distribution_95 (boolean, default False) – the default behavior (distribution_95=False) is for the function to return the confidence region for the mean direction; if distribution_95=True, the parameters associated with the region containing 95% of the directions are returned instead.
- Returns:
dec : mean declination inc : mean inclination n : number of datapoints Zeta : major ellipse Zdec : declination of major ellipse axis Zinc : inclination of major ellipse axis Eta : minor ellipse Edec : declination of minor ellipse axis Einc : inclination of minor ellipse axis
- Return type:
kpars dictionary keys
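Examples
A hedged sketch using NN equal to the number of directions, as documented above for data that are not bootstrapped means (the directions are invented):
>>> from pmagpy import pmag
>>> di_block = [[140, 21], [127, 23], [142, 19], [136, 22]]
>>> kpars = pmag.dokent(di_block, len(di_block))   # mean in kpars['dec'], kpars['inc']; ellipse in the Zeta/Eta keys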
- pmagpy.pmag.dolnp(data, direction_type_key)[source]#
Returns fisher mean, a95 for data using the method of McFadden and McElhinny 1988 for lines and planes.
- Parameters:
Data (nested list of dictionaries with keys) –
- Data model 3.0:
dir_dec dir_inc dir_tilt_correction method_codes
- Data model 2.5:
dec inc tilt_correction magic_method_codes
direction_type_key (['specimen_direction_type'])
- Returns:
ReturnData (dictionary with keys) – dec : fisher mean dec of data in Data inc : fisher mean inc of data in Data n_lines : number of directed lines [method_code = DE-BFL or DE-FM] n_planes : number of best fit planes [method_code = DE-BFP] alpha95 : fisher confidence circle from Data R : fisher R value of Data K : fisher k value of Data
- Effects:
prints to screen in case of no data
- pmagpy.pmag.dolnp3_0(Data)[source]#
DEPRECATED!! USE dolnp(). Takes a list of dicts with the controlled vocabulary of 3.0 and calls dolnp on them after reformatting for compatibility.
- Parameters:
Data (nested list of dictionaries with keys) – dir_dec dir_inc dir_tilt_correction method_codes
- Returns:
ReturnData – dec : fisher mean dec of data in Data inc : fisher mean inc of data in Data n_lines : number of directed lines [method_code = DE-BFL or DE-FM] n_planes : number of best fit planes [method_code = DE-BFP] alpha95 : fisher confidence circle from Data R : fisher R value of Data K : fisher k value of Data
- Return type:
dictionary with keys
- Effects
prints to screen in case of no data
- pmagpy.pmag.domean(data, start, end, calculation_type)[source]#
Gets average direction using Fisher or principal component analysis (line or plane) methods.
- Parameters:
data (nest list of data) – eg. [[treatment,dec,inc,int,quality],…]
start (step being used as start of fit (often temperature minimum))
end (step being used as end of fit (often temperature maximum))
calculation_type (string describing type of calculation to be made) –
'DE-BFL' (line)
'DE-BFL-A' (line-anchored)
'DE-BFL-O' (line-with-origin)
'DE-BFP' (plane)
'DE-FM' (Fisher mean)
- Returns:
mpars – The keys within are “specimen_n”,”measurement_step_min”, “measurement_step_max”,”specimen_mad”,”specimen_dec”,”specimen_inc”.
- Return type:
dictionary
- pmagpy.pmag.doprinc(data)[source]#
Gets principal components from data in form of a list of [dec,inc,int] data.
- Parameters:
data (nested list of dec, inc and optionally intensity vectors)
- Returns:
ppars – dec : principal direction declination inc : principal direction inclination V2dec : intermediate eigenvector declination V2inc : intermediate eigenvector inclination V3dec : minor eigenvector declination V3inc : minor eigenvector inclination tau1 : major eigenvalue tau2 : intermediate eigenvalue tau3 : minor eigenvalue N : number of points
- Return type:
dictionary with the principal components
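Examples
A short sketch on a small set of invented directions:
>>> from pmagpy import pmag
>>> data = [[140, 21], [127, 23], [142, 19], [136, 22]]
>>> ppars = pmag.doprinc(data)   # principal direction in ppars['dec'], ppars['inc']; eigenvalues in tau1-tau3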
- pmagpy.pmag.doreverse(dec, inc)[source]#
Calculates the antipode of a direction.
- Parameters:
dec (float) – declination
inc (float) – inclination
- Returns:
dec (float) – antipode of the declination
inc (float) – antipode of the inclination
Examples
>>> pmag.doreverse(30,45) (210.0, -45)
- pmagpy.pmag.doreverse_list(decs, incs)[source]#
Calculates the antipode of list of directions.
- Parameters:
decs (list of declinations)
incs (list of inclinations)
- Returns:
decs_flipped (antipode list of declinations)
incs_flipped (antipode list of inclinations)
Examples
>>> pmag.doreverse_list([30,32,70,54],[60,62,0,10]) ([210.0, 212.0, 250.0, 234.0], [-60, -62, 0, -10])
- pmagpy.pmag.doseigs(s)[source]#
Converts the six element s format to eigenvalues and eigenvectors.
- Parameters:
s (the six tensor elements as a list) – (s=[x11,x22,x33,x12,x23,x13])
- Returns:
A three element array and a nested list of dec, inc pairs
tau (three element array ([t1,t2,t3])) – a list of eigenvalues in decreasing order
V (second array ([[V1_dec,V1_inc],[V2_dec,V2_inc],[V3_dec,V3_inc]])) – a list of the eigenvector directions
Examples
>>> pmag.doseigs([1,2,3,4,5,6]) ([2.021399, -0.33896524, -0.6824337], [[44.59696385583322, 40.45122920806129], [295.4500678147439, 21.04129013670037], [185.0807541485627, 42.138918019674385]])
- pmagpy.pmag.dosgeo(s, az, pl)[source]#
Rotates the "s" tensor into geographic coordinates using the azimuth and plunge of the specimen X direction.
- Parameters:
s ([x11,x22,x33,x12,x23,x13] - the six tensor elements)
az (the azimuth of the specimen X direction)
pl (the plunge (inclination) of the specimen X direction)
- Returns:
s_rot
- Return type:
[x11,x22,x33,x12,x23,x13] after rotation
Examples
>>> pmag.dosgeo([0.33586472,0.32757074,0.33656454,0.0056526,0.00449771,-0.00036542],12,33) array([ 0.33509237 , 0.3288845 , 0.33602312 , 0.0038898108, 0.0066036563, -0.0018823999], dtype=float32)
- pmagpy.pmag.dostilt(s, bed_az, bed_dip)[source]#
Rotates “s” tensor to stratigraphic coordinates
- Parameters:
s ([x11,x22,x33,x12,x23,x13] - the six tensor elements)
bed_az (bedding dip direction)
bed_dip (bedding dip)
- Returns:
s_rot
- Return type:
[x11,x22,x33,x12,x23,x13] - after rotation
Examples
>>> pmag.dostilt([0.33586472,0.32757074,0.33656454,0.0056526,0.00449771,-0.00036542],20,38) array([ 0.33473614 , 0.32911453 , 0.33614933 , 0.0075679934, 0.0020322995, -0.0014457355], dtype=float32)
- pmagpy.pmag.dosundec(sundata)[source]#
Returns the declination for a given set of suncompass data.
- Parameters:
sundata (dictionary with these keys:) – date: time string with the format ‘yyyy:mm:dd:hr:min’ delta_u: time to SUBTRACT from local time for Universal time lat: latitude of location (negative for south) lon: longitude of location (negative for west) shadow_angle: shadow angle of the desired direction with respect to the sun.
- Returns:
sunaz
- Return type:
the declination of the desired direction with respect to true north
Examples
>>> sundata={'date':'1994:05:23:16:9','delta_u':3,'lat':35,'lon':33,'shadow_angle':68} >>> pmag.dosundec(sundata) 154.24420046668928
- pmagpy.pmag.dotilt(dec, inc, bed_az, bed_dip)[source]#
Does a tilt correction on a direction (dec,inc) using bedding dip direction and bedding dip.
- Parameters:
dec (declination directions in degrees)
inc (inclination direction in degrees)
bed_az (bedding dip direction)
bed_dip (bedding dip)
- Returns:
dec,inc
- Return type:
a tuple of rotated dec, inc values
Examples
>>> pmag.dotilt(91.2,43.1,90.0,20.0) (90.952568837153436, 23.103411670066617)
- pmagpy.pmag.dotilt_V(indat)[source]#
Does a tilt correction on an array with rows of [dec, inc, bedding dip direction, bedding dip].
- Parameters:
indat (nested array of [[dec1, inc1, bed_az1, bed_dip1],[dec2,inc2,bed_az2,bed_dip2]...]) – declination, inclination, bedding dip direction and bedding dip
- Returns:
dec, inc
- Return type:
arrays of rotated declination, inclination
Examples
>>> pmag.dotilt_V(np.array([[91.2,43.1,90.0,20.0],[92.0,40.4,90.5,21.3]])) (array([90.95256883715344, 91.70884991139725]), array([23.103411670066613, 19.105747819853423]))
- pmagpy.pmag.dovandamme(vgp_df)[source]#
Determine the S_b value for VGPs using the Vandamme (1994) method for determining cutoff value for “outliers”.
- Parameters:
vgp_df (pandas DataFrame with required column "vgp_lat") – This should be in the desired coordinate system and assumes one polarity
- Returns:
vgp_df (after applying cutoff)
cutoff (colatitude cutoff)
S_b (S_b of vgp_df after applying cutoff)
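Examples
A minimal sketch assuming a DataFrame holding only the required vgp_lat column (the latitudes are invented and assume a single polarity, per the note above):
>>> import pandas as pd
>>> from pmagpy import pmag
>>> vgp_df = pd.DataFrame({'vgp_lat': [80, 75, 85, 62, 88, 82]})
>>> trimmed_df, cutoff, S_b = pmag.dovandamme(vgp_df)   # VGPs beyond the colatitude cutoff are removed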
- pmagpy.pmag.dovds(data)[source]#
Calculates vector difference sum for demagnetization data.
- Parameters:
data (nested array of data)
- Returns:
vds
- Return type:
vector difference of data as a float
Examples
>>> data = np.array([ [16.0, 43.0, 21620.33], [30.5, 53.6, 12922.58], [6.9, 33.2, 15780.08], [352.5, 40.2, 33947.52], [354.2, 45.1, 19725.45]]) >>> pmag.dovds(data) 69849.6597634
- pmagpy.pmag.dread(infile, cols)[source]#
Deprecated 9/14/2022
Reads in specimen, tr, dec, inc, int into data[]. The positions of tr, dec, inc, int are determined by cols[].
- pmagpy.pmag.execute(st, **kwargs)[source]#
Workaround for the Python 3 exec function, which doesn't allow changes to the local namespace because of scope. This breaks a lot of the old functionality in the code, which was originally written in Python 2, so this function runs just like exec except that it returns the output of the input statement to the local namespace. It may break if you feed it multiline monoliths of statements (untested), but you shouldn't do that anyway.
- Parameters:
st (the statement you want executed and for which you want the return)
kwargs (anything that may need to be in this namespace to execute st)
- Return type:
The return value of executing the input statement
- pmagpy.pmag.fcalc(col, row)[source]#
Looks up an F-test statistic from the F tables F(col,row), where row is the number of degrees of freedom - this is 95% confidence (p=0.05).
- Parameters:
col (degrees of freedom column)
row (degrees of freedom row)
- Returns:
F
- Return type:
value for 95% confidence from the F-table
Examples
>>> pmag.fcalc(3,4.8) 6.5915
- pmagpy.pmag.fillkeys(Recs)[source]#
Reconciles keys of dictionaries within Recs.
- Parameters:
Recs (list of dictionaries in MagIC format OR pandas dataframe)
- Returns:
input Recs
keylist (list of keys found in Recs)
- pmagpy.pmag.find(f, seq)[source]#
Returns input value (f) if it is in the given array (seq).
- Parameters:
f (string value)
seq (array of strings)
- Return type:
String value ‘f’ if it is found in seq.
Examples
>>> A = ['11', '12', '13', '14'] >>> find('11',A) '11'
- pmagpy.pmag.find_CMDT_CR(Ahat, Tc, mhat12)[source]#
Finds the sequence of points along the confidence region of the Common Mean Direction Test (CMDT-CR) of Heslop et al., 2023. Provides a collection of points on the boundary of the 1-α confidence region for the common mean direction according to the procedure in Appendix B.
N.B.: find_CMDT_CR should only be used if the null hypothesis of a common mean direction cannot be rejected.
- Parameters:
Ahat (ndarray) – Combined covariance matrix.
Tc (float) – T value on the boundary of the confidence region.
mhat12 (ndarray) – Estimated common mean direction.
- Returns:
Sequence of points along the confidence region.
- Return type:
ndarray
- pmagpy.pmag.find_CR(mhat, Mhat, Ghat, n, Tc)[source]#
Calculates the closed confidence region boundary, mCI.
- Parameters:
mhat – numpy array representing the mean direction of the original data set.
Mhat – numpy matrix representing the Mhat matrix for mean direction.
Ghat – numpy matrix representing the covariance matrix.
n – int, number of observations.
Tc – float, critical T value on the confidence region boundary.
- Returns:
closed confidence region boundary, mCI.
- Return type:
numpy array
- Raises:
None –
- pmagpy.pmag.find_T(m, n, Mhat, Ghat)[source]#
Calculates the T value estimated from Equation 6.
- Parameters:
m – numpy matrix representing the direction under consideration.
n – int, number of observations.
Mhat – numpy matrix representing the Mhat matrix for mean direction.
Ghat – numpy matrix representing the covariance matrix.
- Returns:
T value estimated from Equation 6 of Heslop et al., 2023
- Return type:
numpy array
- Raises:
None –
- pmagpy.pmag.find_dmag_rec(s, data, **kwargs)[source]#
Returns demagnetization data for specimen s from the data. Excludes other kinds of experiments and “bad” measurements.
- Parameters:
s (specimen name)
data (DataFrame with measurement data)
**kwargs – version : if not 3, assume data model = 2.5
- Returns:
datablock (nested list of data for zijderveld plotting) – [[tr, dec, inc, int, ZI, flag],…] tr : treatment step dec : declination inc : inclination int : intensity ZI : whether zero-field first or infield-first step flag : g or b , default is set to ‘g’
units (list of units found [‘T’,’K’,’J’] for tesla, kelvin or joules)
- pmagpy.pmag.find_f(data)[source]#
Given a distribution of directions, this function determines parameters (elongation, inclination, flattening factor, and elongation direction) that are consistent with the TK03 secular variation model.
- Parameters:
data (array of declination, inclination pairs)
- Returns:
Es (list of elongation values)
Is (list of inclination values)
Fs (list of flattening factors)
V2s (list of elongation directions (relative to the distribution))
The function will return a zero list ([0]) for each of these parameters if the directions constitute a pathological distribution.
Examples
>>> directions = np.array([[140,21],[127,23],[142,19],[136,22]]) >>> Es, Is, Fs, V2s = pmag.find_f(directions)
- pmagpy.pmag.findrec(s, data)[source]#
Finds all the records belonging to s in data.
- Parameters:
s (str) – data value of interest
data (nested list of data) – eg. [[treatment,dec,inc,int,quality],…]
- Returns:
datablock
- Return type:
nested list of data relating to s
Examples
>>> data = [['treatment','dec','inc','int','quality'],['treatment1','dec1','inc1','int1','quality1']] >>> pmag.findrec('treatment', data) [['dec', 'inc', 'int', 'quality']]
- pmagpy.pmag.first_rec(ofile, Rec, file_type)[source]#
Opens the file ofile as a magic template file with headers as the keys to Rec.
- Parameters:
ofile (string with the path of the input file)
- pmagpy.pmag.fisher_by_pol(data)[source]#
Do fisher mean after splitting data into two polarity domains.
- Parameters:
data (list of dictionaries with 'dec' and 'inc')
- Returns:
'A' = fisher mean of polarity 'A' directions, 'B' = fisher mean of polarity 'B' directions, 'All' = fisher mean of all data after switching the polarity of the 'B' directions
- Return type:
three dictionaries
Examples
>>> data = [{'dec':-45,'inc':150}, {'dec':-44,'inc':150},{'dec':-45.3,'inc':149}] >>> pmag.fisher_by_pol(data) {'B': {'dec': 135.23515314555496, 'inc': 30.334504880687444, 'n': 3, 'r': 2.9997932987279383, 'k': 9675.799186195498, 'alpha95': 1.2533447889568254, 'csd': 0.8234582703442529, 'sites': '', 'locs': ''}, 'All': {'dec': 315.23515314555493, 'inc': -30.334504880687444, 'n': 3, 'r': 2.999793298727938, 'k': 9675.79918617471, 'alpha95': 1.2533447889582796, 'csd': 0.8234582703451375, 'sites': '', 'locs': ''}}
- pmagpy.pmag.fisher_mean(data)[source]#
Calculates the Fisher mean and associated parameter from a di_block.
- Parameters:
data (nested list of [dec,inc] or [dec,inc,intensity])
- Returns:
fpars – dec : mean declination inc : mean inclination r : resultant vector length n : number of data points k : Fisher k value csd : Fisher circular standard deviation alpha95 : Fisher circle of 95% confidence
- Return type:
dictionary containing the Fisher mean and statistics with keys
Examples
>>> data = [[150,-45],[151,-46],[145,-38],[146,-41]] >>> pmag.fisher_mean(data) {'dec': 147.87247771265734, 'inc': -42.52872729473035, 'n': 4, 'r': 3.9916088992115832, 'k': 357.52162626162925, 'alpha95': 4.865886096375297, 'csd': 4.283846101842065}
- pmagpy.pmag.fix_directories(input_dir_path, output_dir_path)[source]#
Takes input and output directory arguments and fixes them. If there is no input_dir_path, defaults to output_dir_path for both. Then returns the realpath for both values.
- Parameters:
input_dir_path (str)
output_dir_path (str)
- Return type:
input_dir_path, output_dir_path
- pmagpy.pmag.flip(di_block, combine=False)[source]#
Determines the 'normal' direction along the principal eigenvector, then flips the reverse mode to the antipode.
- Parameters:
di_block (nested list of directions)
combine (whether to return directions as one di_block (default is False))
- Returns:
D1 (normal mode di_block)
D2 (reverse mode flipped to its antipode, as a second di_block)
If combine=True, one combined D1 + D2 di_block will be returned
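Examples
A brief sketch mixing normal and reverse directions (invented values) to show the two return modes:
>>> from pmagpy import pmag
>>> di_block = [[5, 45], [355, 50], [185, -45], [175, -50]]
>>> normal, flipped = pmag.flip(di_block)          # two separate di_blocks
>>> combined = pmag.flip(di_block, combine=True)   # a single combined di_block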
- pmagpy.pmag.form_Ghat(X, Mhat)[source]#
Form the Ghat matrix based on a collection of directions X and the Mhat matrix according to Equation 5 of Heslop et al., 2023
- Parameters:
X (ndarray) – Cartesian coordinates of directions
Mhat (ndarray) – Mhat matrix for mean direction.
- Returns:
Ghat matrix according to equation 5 of Heslop et al., 2023.
- Return type:
ndarray
- pmagpy.pmag.form_Mhat(mhat)[source]#
Calculate the Mhat matrix based on data set according to Equation 4 of Heslop et al., 2023.
- Parameters:
mhat (ndarray) – Cartesian coordinates of estimated sample mean direction
- Returns:
Mhat matrix according to the equation 4 of Heslop et al., 2023.
- Return type:
ndarray
- Raises:
ValueError – If the data sets have incompatible shapes.
- pmagpy.pmag.form_Q(a, b)[source]#
Creates the rotation matrix Q so that Qb = a (according to equations (9) and (10) of Heslop et al., 2023).
- Parameters:
a (ndarray) – Destination direction (unit vector).
b (ndarray) – Starting direction (unit vector).
- Returns:
Rotation matrix Q.
- Return type:
ndarray
- pmagpy.pmag.fshdev(k)[source]#
Generate a random draw from a Fisher distribution with mean declination of 0 and inclination of 90 with a specified kappa.
- Parameters:
k (single number or an array of values) – kappa (precision parameter) of the distribution
- Returns:
dec, inc – if k is an array, dec, inc are returned as arrays, otherwise, single values
- Return type:
declination and inclination of random Fisher distribution draw
Examples
>>> pmag.fshdev(8) (334.3434290469283, 61.06963783415771)
- pmagpy.pmag.gaussdev(mean, sigma, N=1)[source]#
Returns N numbers randomly drawn from a gaussian distribution with the given mean and sigma.
- Parameters:
mean (mean of the gaussian distribution from which to draw deviates)
sigma (standard deviation of the distribution)
N (number of deviates desired, default is 1)
- Return type:
N deviates from the normal distribution
Examples
>>> pmag.gaussdev(5.5,1.2,6) array([5.090856280215007, 3.305193918953536, 7.313490558588299, 5.412029315803913, 6.819820301799303, 7.632257251681613])
- pmagpy.pmag.gausspars(data)[source]#
Calculates gaussian statistics for data.
- Parameters:
data (array of data)
- Returns:
mean (array the length of the data array)
stdev (second array the length of the data array)
Examples
>>> data=np.loadtxt('data_files/vector_mean/vector_mean_example.dat') >>> pmag.gausspars(data) (array([ 154.72699999999995, 44.43599999999999, 23709.242399999992 ]), array([ 166.93766686153165 , 19.578257988354988, 11563.604723319804 ]))
>>> data = np.array([ [16.0, 43.0, 21620.33], [30.5, 53.6, 12922.58], [6.9, 33.2, 15780.08], [352.5, 40.2, 33947.52], [354.2, 45.1, 19725.45]]) >>> pmag.gausspars(data) (array([ 152.02, 43.019999999999996, 20799.192]), array([1.839818931308187e+02, 7.427112494098901e+00, 8.092252785230450e+03]))
- pmagpy.pmag.get_Sb(data)[source]#
Returns vgp scatter for a data set.
- Parameters:
data (data set as a list or a pandas dataframe)
- Return type:
float value of the vgp scatter
- pmagpy.pmag.get_age(Rec, sitekey, keybase, Ages, DefaultAge)[source]#
Finds the age record for a given site.
- pmagpy.pmag.get_azpl(cdec, cinc, gdec, ginc)[source]#
Gets azimuth and plunge from specimen declination, inclination, and (geographic) coordinates.
- Parameters:
cdec (specimen declination)
cinc (specimen inclination)
gdec (geographic declination)
ginc (geographic inclination)
- Return type:
list of the two values for the azimuth and plunge
Examples
>>> pmag.get_azpl(85,110,80.2,112.3) (323.77999999985053, -12.079999999990653)
- pmagpy.pmag.get_dictitem(In, k, v, flag, float_to_int=False)[source]#
Returns a list of dictionaries from list In with key (k) = value (v). The comparison is CASE INSENSITIVE.
Allowed keywords for flag are T, F, has, not, eval, min, and max. Requires that the value of k in the dictionaries contained in In, and v, be castable to string if flag is T, F, has, or not, and castable to float if flag is eval, min, or max. float_to_int goes through the relevant values in In and truncates them (like "0.0" to "0") for evaluation; default is False.
- Parameters:
In (list of dictionaries)
k (key to test)
v (key value to test)
flag ([T,F,has, or not])
float_to_int (boolean, default is False)
- Return type:
List of dictionaries that meet conditions
Examples
>>> In=[{'specimen':'abc01b01','dec':'10.3','inc':'43','int':'5.2e-6'}, {'specimen':'abc01b02','dec':'12.3','inc':'42','int':'4.9e-6'}] >>> k = 'specimen' >>> v = 'abc01b02' >>> flag='T' >>> get_dictitem(In,k,v,flag) [{'specimen': 'abc01b02', 'dec': '12.3', 'inc': '42', 'int': '4.9e-6'}]
- pmagpy.pmag.get_dictkey(In, k, dtype)[source]#
Returns list of given key (k) from input list of dictionaries (In) in data typed dtype.
- Parameters:
In (list of dictionaries to work on)
k (key to return)
dtype (str) – “” : returns string value “f” : returns float “int” : returns integer
- Returns:
Out
- Return type:
List of values of the key specified to return
Examples
>>> In=[{'specimen':'abc01b01','dec':'10.3','inc':'43','int':'5.2e-6'}, {'specimen':'abc01b02','dec':'12.3','inc':'42','int':'4.9e-6'}] >>> k = 'specimen' >>> dtype = '' >>> get_dictkey(In,k,dtype) ['abc01b01', 'abc01b02']
- pmagpy.pmag.get_named_arg(name, default_val=None, reqd=False)[source]#
Extract the value after a command-line flag such as ‘-f’ and return it. If the command-line flag is missing, return default_val. If reqd == True and the command-line flag is missing, throw an error.
- Parameters:
name (str) – command line flag, e.g. “-f”
default_val – value to use if command line flag is missing, e.g. “measurements.txt” default is None
reqd (bool) – throw error if reqd==True and command line flag is missing. if reqd == True, default_val will be ignored. default is False.
- Return type:
Desired value from sys.argv if available, otherwise default_val.
- pmagpy.pmag.get_orient(samp_data, er_sample_name, **kwargs)[source]#
Returns orientation and orientation method of input sample (er_sample_name).
- Parameters:
samp_data (PmagPy list of dicts or pandas DataFrame)
er_sample_name (string for the sample name)
- Return type:
Orientation data and corresponding orientation method of specified sample (er_sample_name).
- pmagpy.pmag.get_plate_data(plate)[source]#
Returns the pole list for a given plate
- Parameters:
plate (string (options: AF, ANT, AU, EU, GL, IN, NA, SA))
- Returns:
apwp – 0.0 90.00 0.00 1.0 88.38 182.20 2.0 86.76 182.20 …
- Return type:
string with format
- pmagpy.pmag.get_sb_df(df, mm97=False)[source]#
Calculates Sf for a dataframe with VGP latitude. Optional Fisher's k, site latitude, and N information can be used to correct for within-site scatter (McElhinny & McFadden, 1997).
- Parameters:
df (Pandas Dataframe with columns) –
- REQUIRED:
vgp_lat : VGP latitude
- ONLY REQUIRED for MM97 correction:
dir_k : Fisher kappa estimate
dir_n : number of specimens (samples) per site
lat : latitude of the site
mm97 (if True, will do the correction for within site scatter)
- Returns:
Sf
- Return type:
float value for the Sf
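Examples
A hedged sketch of the two documented modes; the column values are invented purely to show the required structure:
>>> import pandas as pd
>>> from pmagpy import pmag
>>> df = pd.DataFrame({'vgp_lat': [80, 75, 85, 70]})
>>> Sf = pmag.get_sb_df(df)            # simple VGP scatter
>>> df['dir_k'] = [50, 60, 45, 55]     # extra columns required for the MM97 correction
>>> df['dir_n'] = [8, 10, 7, 9]
>>> df['lat'] = [33, 33, 33, 33]
>>> Sf_corr = pmag.get_sb_df(df, mm97=True)   # corrected for within-site scatter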
- pmagpy.pmag.get_specs(data)[source]#
Takes a magic format file and returns a list of unique specimen names.
- pmagpy.pmag.get_test_WD()[source]#
Find proper working directory to run tests. With developer install, tests should be run from PmagPy directory. Otherwise, assume pip install, and run tests from sys.prefix, where data_files are installed by setuptools.
- pmagpy.pmag.get_tilt(dec_geo, inc_geo, dec_tilt, inc_tilt)[source]#
Function to return the dip direction and dip that would yield the tilt corrected direction if applied to the uncorrected direction (geographic coordinates).
- Parameters:
dec_geo (declination in geographic coordinates)
inc_geo (inclination in geographic coordinates)
dec_tilt (declination in tilt-corrected coordinates)
inc_tilt (inclination in tilt-corrected coordinates)
- Returns:
DipDir, Dip
- Return type:
tuple of dip direction and dip
Examples
>>> pmag.get_tilt(85,110,80.2,112.3) (223.67057238530975, 2.95374920443805)
- pmagpy.pmag.get_ts(ts)[source]#
Returns GPTS timescales. Options are: ck95, gts04, gts12, gts20. Returns the timescale and Chron labels.
- pmagpy.pmag.get_unf(N=100)[source]#
Generates N uniformly distributed directions using the way described in Fisher et al. (1987).
- Parameters:
N (number of directions, default is 100)
- Return type:
array of nested dec, inc pairs
Examples
>>> pmag.get_unf(5) array([[ 62.916547703466684, -30.751721919151798], [145.94851610484855 , 76.45636268514875 ], [312.61910867788174 , -67.24338629811932 ], [ 61.71574344812653 , -4.005335509042522], [ 15.867001505749716, -1.404412703673322]])
- pmagpy.pmag.get_version()[source]#
Determines the version of PmagPy installed on your machine.
- Returns:
version
- Return type:
string of pmagpy version, such as “pmagpy-3.8.8”
Examples
>>> pmag.get_version() 'pmagpy-4.2.106'
- pmagpy.pmag.getmeths(method_type)[source]#
Returns MagIC method codes available for a given type.
- Parameters:
method_type (str)
- Returns:
meths
- Return type:
specified methods codes for the given type
- pmagpy.pmag.getvec(gh, lat, lon)[source]#
Evaluates the vector at a given latitude and longitude for a specified set of coefficients.
- Parameters:
gh (a list of gauss coefficients)
lat (latitude of location)
lon (longitude of location)
- Returns:
vec
- Return type:
direction as an array [dec, inc, intensity]
Examples
>>> gh = pmag.doigrf(30,70,10,2022,coeffs=True) >>> pmag.getvec(gh, 30,70) array([2.007319473143944e+00, 4.740186709049829e+01, 4.831229434010185e+04])
- pmagpy.pmag.gha(julian_day, f)[source]#
Returns greenwich hour angle.
- Parameters:
julian_day (int, julian day)
f (int) – fraction of the day in Universal Time, (hrs + (min/60))/24
- Returns:
H (float, hour angle)
delta (float, angle)
Examples
>>> julianday = pmag.julian(10,20,2000) >>> pmag.gha(julianday, 33) (183.440612472039, -20.255315389871825) >>> pmag.gha(2451838, 33) (183.440612472039, -20.255315389871825)
- pmagpy.pmag.grade(PmagRec, ACCEPT, type, data_model=2.5)[source]#
Finds the ‘grade’ (pass/fail; A/F) of a record (specimen,sample,site) given the acceptance criteria
- pmagpy.pmag.import_cartopy()[source]#
Try to import cartopy and print out a help message if it is not installed
- Returns:
has_cartopy (bool)
cartopy (cartopy package if available else None)
- pmagpy.pmag.initialize_acceptance_criteria(**kwargs)[source]#
Initializes acceptance criteria with NULL values for thellier_gui and demag_gui.
The acceptance criteria format is a dictionary of dictionaries:
- acceptance_criteria={}
- acceptance_criteria[crit]={}
with keys acceptance_criteria[crit]['category'], acceptance_criteria[crit]['criterion_name'], acceptance_criteria[crit]['value'], acceptance_criteria[crit]['threshold_type'] and acceptance_criteria[crit]['decimal_points']:
- 'category':
'DE-SPEC', 'DE-SAMP', etc.
- 'criterion_name':
MagIC name
- 'value':
a number (for 'regular' criteria); a string (for 'flag'); 1 for True and 0 for False (if the criterion is boolean); -999 means N/A
- 'threshold_type':
'low' for low threshold value; 'high' for high threshold value; [flag1.flag2] for flags; 'bool' for boolean flags (can be 'g', 'b' or True/False or 1/0)
- 'decimal_points':
number of decimal points used in rounding (this is used in displaying criteria in the dialog box); -999 means exponent with 3 decimal points for floats and string for strings
- pmagpy.pmag.int_pars(x, y, vds, **kwargs)[source]#
Deprecated 9/7/2022
Calculates York regression and paleointensity parameters (with Tauxe Fvds).
- pmagpy.pmag.interval_overlap(interval_a, interval_b)[source]#
Determine the extent of overlap between two ranges of numbers
- Parameters:
interval_a (a list of [min, max])
interval_b (a list of [min, max])
- Returns:
overlap
- Return type:
the amount of overlap between interval_a and interval_b
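Examples
A small sketch with the documented [min, max] inputs (the second call assumes disjoint intervals report zero overlap):
>>> from pmagpy import pmag
>>> pmag.interval_overlap([0, 10], [5, 20])    # the intervals share 5 units
>>> pmag.interval_overlap([0, 10], [20, 30])   # no overlap expected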
- pmagpy.pmag.julian(mon, day, year)[source]#
Returns julian day.
- Parameters:
mon (int, month)
day (int, day)
year (int, year)
- Returns:
julian_day
- Return type:
Julian day as an int
Examples
>>> pmag.julian(10,20,2000) 2451838
- pmagpy.pmag.kentdev(kappa, beta, n=1000)[source]#
Generate a random draw from a Kent distribution with mean declination of 0 and inclination of 90, elongated along -90 to 90 longitude with a specified kappa and beta.
- Parameters:
kappa (kappa (precision parameter) of the distribution)
beta (beta ellipticity of the contours of equal probability of the distribution)
n (number of samples to draw, default is 1000)
- Returns:
dec, inc
- Return type:
declination and inclination of random Kent distribution draw
Examples
>>> pmag.kentdev(30,0.2,3) ([249.6338265814872, 243.60784772662754, 273.37935292238103], [74.05222965175194, 80.43784483273899, 82.34979130960458])
- pmagpy.pmag.lowes(data)[source]#
Gets the Lowes power spectrum from gauss coefficients.
- Parameters:
data (nested list of [[l,m,g,h],...] as from pmag.unpack())
- Returns:
Ls (list of degrees l)
Rs (power at degree l)
- pmagpy.pmag.magic_help(keyhelp)[source]#
Returns a help message for a given magic key.
- Parameters:
keyhelp (str) – key name that the user seeks more information about
- Returns:
Information about the input key
- Return type:
str
Examples
>>> pmag.magic_help('location_url') 'Website URL for the location explicitly'
- pmagpy.pmag.magic_read(infile, data=None, return_keys=False, verbose=False)[source]#
Reads a Magic template file, returns data in a list of dictionaries.
- Parameters:
infile (str) – the MagIC formatted tab delimited data file; the first line contains 'tab' in the first column and the data file type in the second (e.g., measurements, specimen, sample, etc.)
data (optional) – data already read in with, e.g., file.readlines()
- Return type:
list of dictionaries, file type
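A minimal usage sketch (the file name is illustrative):
import pmagpy.pmag as pmag

data, file_type = pmag.magic_read('measurements.txt')  # list of dicts and the MagIC table type
print(file_type, len(data))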
- pmagpy.pmag.magic_read_dict(path, data=None, sort_by_this_name=None, return_keys=False)[source]#
Read a magic-formatted tab-delimited file and returns a dictionary of dictionaries, with this format:
{'Z35.5a': {'specimen_weight': '1.000e-03', 'er_citation_names': 'This study', 'specimen_volume': '', 'er_location_name': '', 'er_site_name': 'Z35.', 'er_sample_name': 'Z35.5', 'specimen_class': '', 'er_specimen_name': 'Z35.5a', 'specimen_lithology': '', 'specimen_type': ''}, ...}
Returns data, file_type, and keys (if return_keys is True).
- pmagpy.pmag.magic_write(ofile, Recs, file_type, dataframe=False, append=False)[source]#
Writes out a magic format list of dictionaries to ofile.
- Parameters:
ofile (path to output file)
Recs (list of dictionaries in MagIC format OR pandas dataframe)
file_type (MagIC table type (e.g., specimens))
dataframe (boolean) – if True, Recs is a pandas dataframe which must be converted to a list of dictionaries
append (boolean) – if True, file will be appended to named file
- Returns:
[True,False] (True if successful)
ofile (same as input)
Effects
-------
writes a MagIC formatted file from Recs
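A minimal usage sketch; the record contents below are illustrative only.
import pmagpy.pmag as pmag

recs = [{'site': 'A1', 'location': 'Hawaii'},
        {'site': 'A2', 'location': 'Hawaii'}]
ok, outfile = pmag.magic_write('sites.txt', recs, 'sites')  # success flag and output file name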
- pmagpy.pmag.magic_write_old(ofile, Recs, file_type)[source]#
Writes out a magic format list of dictionaries to ofile.
- Parameters:
ofile (path to output file)
Recs (list of dictionaries in MagIC format)
file_type (MagIC table type (e.g., specimens))
Effects
-------
writes a MagIC formatted file from Recs
- pmagpy.pmag.magnetic_lat(inc)[source]#
Calculates the magnetic latitude from inclination.
- Parameters:
inc (single float or array) – inclination value(s)
- Returns:
paleo_lat – magnetic latitude from the given inclination(s)
- Return type:
single float or array
Examples
>>> pmag.magnetic_lat(35)
19.29534273533122
>>> pmag.magnetic_lat([35,60,20])
array([19.29534273533122 , 40.8933946491309 , 10.314104815618196])
- pmagpy.pmag.magsyn(gh, sv, b, date, itype, alt, colat, elong)[source]#
Computes x, y, z, and f for a given date and position, from the spherical harmonic coefficients of the International Geomagnetic Reference Field (IGRF). From Malin and Barraclough (1981), Computers and Geosciences, V.7, 401-405.
- Parameters:
gh (main field values for date (calc. in igrf subroutine))
sv (secular variation coefficients (calc. in igrf subroutine))
b (date of dgrf (or igrf) field prior to required date)
date (Required date in years and decimals of a year (A.D.))
itype (1, if geodetic coordinates are used, 2 if geocentric)
alt (if itype = 1, height above mean sea level in km; if itype = 2, radial distance from the center of the earth)
colat (colatitude in degrees (0 to 180))
elong (east longitude in degrees (0 to 360))
- Returns:
x (north component of the magnetic force in nT)
y (east component of the magnetic force in nT)
z (downward component of the magnetic force in nT)
f (total magnetic force in nT)
note (the coordinate system for x,y, and z is the same as that specified by itype)
- pmagpy.pmag.makelist(List)[source]#
Makes a colon delimited list from List.
- Parameters:
List (any list of strings or numbers)
- Return type:
colon delimited list
Examples
>>> pmag.makelist(["mT","T","Am"])
'mT:T:Am'
- pmagpy.pmag.mark_dmag_rec(s, ind, data)[source]#
Deprecated 9/14/2022
Edits demagnetization data to mark “bad” points with measurement_flag.
- pmagpy.pmag.measurements_methods3(meas_data, noave, savelast=False)[source]#
Add necessary method codes, experiment names, sequence, etc.
- pmagpy.pmag.merge_recs_headers(recs)[source]#
Takes a list of recs [rec1, rec2, rec3, …], where each rec is a dictionary, and makes sure that all recs have the same headers.
- pmagpy.pmag.mktk03(terms, seed, G2, G3, G1=-18000.0, verbose=False)[source]#
Generates a list of gauss coefficients drawn from the TK03 distribution.
- Parameters:
terms (int) – number of terms to return
seed (random seed)
G2 (int) – ratio of axial quadrupole term to dipole term
G3 (int) – ratio of axial octupole term to dipole term
G1 (float) – value of the axial dipole, default is -18e3 (in nT)
verbose (default is False)
- Returns:
gh – list of l,m,g,h field model generated by TK03
- Return type:
list
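A usage sketch; the number of terms, seed, and G2/G3 ratios below are illustrative.
import pmagpy.pmag as pmag

# draw a TK03-style coefficient set; G1 defaults to -18000 nT
gh = pmag.mktk03(8, 1, 0, 0)  # terms=8, seed=1, no quadrupole (G2=0) or octupole (G3=0) contribution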
- pmagpy.pmag.open_file(infile, verbose=True)[source]#
Open file and return a list of the file’s lines. Try to use utf-8 encoding, and if that fails use Latin-1.
- Parameters:
infile (str) – full path to file
- Returns:
data – all lines in the file
- Return type:
list
- pmagpy.pmag.orient(mag_azimuth, field_dip, or_con)[source]#
Uses specified orientation convention to convert user supplied orientations to laboratory azimuth and plunge.
- Parameters:
mag_azimuth (float) – orientation of the field orientation arrow with respect to north
field_dip (float) – dip (or hade) of the field arrow; if hade, with respect to vertical down; if inclination, with respect to horizontal (positive down)
or_con (int) – orientation convention; see the conventions listed below
Samples are oriented in the field with a "field arrow" and measured in the laboratory with a "lab arrow". The lab arrow is the positive X direction of the right-handed coordinate system of the specimen measurements; the lab and field arrows may not be the same. In the MagIC database we require the orientation (azimuth and plunge) of the X direction of the measurements (lab arrow). The conventions below convert the field arrow azimuth (mag_azimuth in the orient.txt file) and dip (field_dip in orient.txt) to the azimuth and plunge of the laboratory arrow (sample_azimuth and sample_dip in er_samples.txt).
- [1] Standard Pomeroy convention of azimuth and hade (degrees from vertical down)
of the drill direction (field arrow). Lab arrow azimuth = sample_azimuth = mag_azimuth; lab arrow dip = sample_dip = -field_dip, i.e. the lab arrow dip is minus the hade.
- [2] Field arrow is the strike of the plane orthogonal to the drill direction,
field dip is the hade of the drill direction. Lab arrow azimuth = mag_azimuth-90; lab arrow dip = -field_dip
- [3] Lab arrow is the same as the drill direction;
hade was measured in the field. Lab arrow azimuth = mag_azimuth; lab arrow dip = 90-field_dip
- [4] Lab azimuth and dip are the same as mag_azimuth and field_dip;
use this for unoriented samples too
- [5] Same as the AZDIP convention:
azimuth and inclination of the drill direction are mag_azimuth and field_dip; lab arrow is as in [1] above. Lab azimuth is the same as mag_azimuth; lab arrow dip = field_dip-90
- [6] Lab arrow azimuth = mag_azimuth-90; lab arrow dip = 90-field_dip
- Return type:
azimuth and dip of lab arrow
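A usage sketch for convention [1] (Pomeroy azimuth and hade); the field measurements are illustrative.
import pmagpy.pmag as pmag

# Pomeroy convention [1]: lab arrow azimuth = mag_azimuth, lab arrow dip = -field_dip (minus the hade)
azimuth, dip = pmag.orient(175., 20., 1)
# expected under convention [1]: azimuth = 175., dip = -20.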
- pmagpy.pmag.parse_site(sample, convention, Z)[source]#
Parse the site name from the sample name using the specified convention
- pmagpy.pmag.pinc(lat)[source]#
Calculate paleoinclination from latitude using dipole formula: tan(I) = 2tan(lat).
- Parameters:
lat (either a single value or an array of latitudes)
- Return type:
array of inclinations
Examples
>>> lats = [45,40,60,80, -30,55]
>>> np.round(pmag.pinc(lats),1)
array([ 63.4, 59.2, 73.9, 85. , -49.1, 70.7])
- pmagpy.pmag.plat(inc)[source]#
Calculate paleolatitude from inclination using dipole formula: tan(I) = 2tan(lat).
- Parameters:
inc (either a single value or an array of inclinations)
- Return type:
array of latitudes
Examples
>>> incs = [63.4,59.2,73.9,85,-49.1,70.7]
>>> np.round(pmag.plat(incs))
array([ 45., 40., 60., 80., -30., 55.])
- pmagpy.pmag.process_data_for_mean(data, direction_type_key)[source]#
Takes a list of dicts with dec and inc, as well as direction_type if possible (or method_codes), sorts the data into lines and planes, and processes them for Fisher means.
- Parameters:
data (list of dicts with dec, inc, and some manner of PCA type info)
direction_type_key (key that indicates the direction type variable in the dictionaries of data)
- Returns:
tuple with values – number of lines; list of lists with [EL, EM, EN] of all planes; number of planes; sum of the Cartesian components of all lines
- Return type:
list of lists with [dec, inc, 1.] for all lines
- pmagpy.pmag.pseudo(DIs, random_seed=None)[source]#
Draw a bootstrap sample of directions returning as many bootstrapped samples as in the input directions.
- Parameters:
DIs (nested list of dec, inc lists (known as a di_block))
random_seed (set random seed for reproducible number generation (default is None))
- Returns:
Bootstrap_directions (nested list of dec, inc lists that have been bootstrap resampled)
Examples
>>> di_block = ([[-45,150], [-40,150], [-38,145]])
>>> pmag.pseudo(di_block,10)
array([[-40, 150], [-40, 150], [-45, 150]])
- pmagpy.pmag.pt_rot(EP, Lats, Lons)[source]#
Rotates points on a globe by an Euler pole rotation using method of Cox and Hart 1986, box 7-3.
- Parameters:
EP (Euler pole list [lat, lon, angle] specifying the location of the pole; the angle is for a counterclockwise rotation about the pole)
Lats (list of latitudes of points to be rotated)
Lons (list of longitudes of points to be rotated)
- Returns:
RLats (list of rotated latitudes)
RLons (list of rotated longitudes)
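A usage sketch with an illustrative Euler pole and two points; no particular output is implied.
import pmagpy.pmag as pmag

euler_pole = [60., -73., 20.]           # [pole latitude, pole longitude, rotation angle (degrees, CCW)]
lats, lons = [45., 30.], [100., 120.]   # points to rotate
RLats, RLons = pmag.pt_rot(euler_pole, lats, lons)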
- pmagpy.pmag.read_criteria_from_file(path, acceptance_criteria, **kwargs)[source]#
Read acceptance criteria from a MagIC criteria file.
Old format: multiple lines, with pmag_criteria_code defining the type of criteria. To deal with the old format this function reads all the lines and ignores empty cells, i.e. the program assumes that each column holds only one value (in one of the lines).
- Special case in the old format:
if specimen_dang has a value and pmag_criteria_code is IE-specimen, the program assumes that the user means specimen_int_dang.
New format for thellier_gui and demag_gui: one long line with pmag_criteria_code=ACCEPT.
path is the full path to the criteria file.
The function takes the existing acceptance_criteria and updates it with criteria from the file.
Output: acceptance_criteria={}, where acceptance_criteria[MagIC Variable Name]={} with the keys:
- acceptance_criteria[MagIC Variable Name]['value']:
a number for the acceptance criteria value; -999 for N/A; 1/0 for True/False or Good/Bad
- acceptance_criteria[MagIC Variable Name]['threshold_type']:
"low": lower cutoff value, i.e. crit>=value passes the criteria; "high": high cutoff value, i.e. crit<=value passes the criteria; [string1, string2, ...]: for flags
- acceptance_criteria[MagIC Variable Name]['decimal_points']:
number of decimal points used in rounding (this is used in displaying criteria in the dialog box)
- pmagpy.pmag.resolve_file_name(fname, dir_path='.')[source]#
Parse file name information and output the full path. Allows input as fname == /path/to/file.txt, or fname == file.txt with dir_path == /path/to. Either way, returns /path/to/file.txt. Used in conversion scripts.
- Parameters:
fname (str) – short filename or full path to file
dir_path (str) – directory, optional
- Returns:
full_file – full path/to/file.txt
- Return type:
str
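For example (paths are illustrative):
import pmagpy.pmag as pmag

pmag.resolve_file_name('file.txt', '/path/to')   # '/path/to/file.txt'
pmag.resolve_file_name('/path/to/file.txt')      # '/path/to/file.txt'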
- pmagpy.pmag.s2a(s)[source]#
Convert 6 element “s” list to 3x3 a matrix (see Tauxe 1998).
- Parameters:
s (six element list of floats)
- Returns:
a
- Return type:
3x3 matrix as an array
Examples
>>> pmag.s2a([1,2,3,4,5,6])
array([[1., 4., 6.], [4., 2., 5.], [6., 5., 3.]], dtype=float32)
- pmagpy.pmag.s_boot(Ss, ipar=0, nb=1000)[source]#
Returns bootstrap parameters for S data.
- Parameters:
Ss (nested array of [[x11 x22 x33 x12 x23 x13],....] data)
ipar (if True, do a parametric bootstrap)
nb (number of bootstraps)
- Returns:
Tmean (average eigenvalues)
Vmean (average eigvectors)
Taus (bootstrapped eigenvalues)
Vs (bootstrapped eigenvectors)
Examples
>>> Ss = [[0.33586472,0.32757074,0.33656454,0.0056526,0.00449771,-0.00036542], [0.33815295,0.32601482,0.33583224,0.00754076,0.00405271,-0.0001627], [0.33806428,0.32925552,0.33268023,0.00480824,-0.00168595,0.0009308], [0.33939844,0.32750368,0.33309788,0.00763409,0.00264978,0.00070303], [0.3348785,0.32816416,0.33695734,0.00574405,0.00278172,-0.00073475], [0.33485019,0.32948497,0.33566481,0.00597801,0.00426423,-0.00040056]]
>>> pmag.s_boot(Ss,0,2)
([0.34040287, 0.3353659, 0.32423124], [[29.594002551414974, 14.457521581993113], [166.31028417625646, 70.4972100801602], [296.2343306258123, 12.805665338949966]], [[0.34002233, 0.33413905, 0.32583863], [0.34043044, 0.33551994, 0.32404962]], [[[26.298051965057486, 5.235004519419732], [183.15464080261913, 84.30971842978398], [296.0941733228108, 2.224044816930646]], [[28.798353815000212, 14.686330248560294], [166.21187481069492, 70.40546729047502], [295.4174263407004, 12.681162985818712]]])
- pmagpy.pmag.s_l(l, alpha=27.7)[source]#
Get sigma as a function of degree l from Constable and Parker (1988)
- Parameters:
l (int) – degree of spherical harmonic expansion
alpha (float) – alpha parameter for CP88 model, default is 27.7 in CP88
- Return type:
sigma corresponding to degree l
Examples
>>> pmag.s_l(4)
0.36967732888223936
- pmagpy.pmag.sbar(Ss)[source]#
Calculate average s,sigma from a list of S’s.
- Parameters:
Ss (nested list of lists) – each list is a six element tensor
- Returns:
nf (degrees of freedom)
sigma (sigma of the list)
avs (the average s values)
Examples
>>> Ss = [[0.33586472,0.32757074,0.33656454,0.0056526,0.00449771,-0.00036542], [0.33815295,0.32601482,0.33583224,0.00754076,0.00405271,-0.0001627], [0.33806428,0.32925552,0.33268023,0.00480824,-0.00168595,0.0009308], [0.33939844,0.32750368,0.33309788,0.00763409,0.00264978,0.00070303], [0.3348785,0.32816416,0.33695734,0.00574405,0.00278172,-0.00073475], [0.33485019,0.32948497,0.33566481,0.00597801,0.00426423,-0.00040056]]
>>> pmag.sbar(Ss)
(30, 0.0018030794236146297, [0.33686818, 0.3279989816666667, 0.33513284, 0.0062262916666666656, 0.002760033333333333, -4.933333333333345e-06])
- pmagpy.pmag.sbootpars(Taus, Vs)[source]#
Get bootstrap parameters for s data from bootstrap eigenvalues and eigenvectors.
- Parameters:
Taus (nested list of eigenvalues)
Vs (nested list of eigenvectors)
- Returns:
bpars
- Return type:
dictionary of bootstrap parameters for the bootstrap eigenvalues and eigenvectors.
Examples
>>> Taus = [[0.89332515, 0.2421235, -0.13544868], [1.2330734, 0.033398163, -0.26647156]]
>>> Vs = [[[16.71852040881784, 22.059363998317398], [122.30845200565045, 33.55240424468586], [259.90057243022835, 48.06963167162283]], [[36.31805058172574, 15.477280574403938], [183.99811452360234, 71.85809815162672], [303.738439079619, 9.23224775163199]]]
>>> pmag.sbootpars(Taus, Vs)
{'t1_sigma': 0.24023829147126252, 't2_sigma': 0.1475911011981474, 't3_sigma': 0.09264716693859128, 'v1_dec': 26.711662224665808, 'v1_inc': 19.026277799227568, 'v1_zeta': 24.690888880899667, 'v1_eta': 1.249303736510881e-14, 'v1_zeta_dec': 290.06398627901706, 'v1_zeta_inc': 18.55700265945684, 'v1_eta_dec': 159.07273109103403, 'v1_eta_inc': 62.89733254726475, 'v2_dec': 137.92012533786792, 'v2_inc': 55.87313967394276, 'v2_zeta': 1.250107785929978e-14, 'v2_eta': 75.07258147484707, 'v2_zeta_dec': 20.268001361016218, 'v2_zeta_inc': 17.460349183865556, 'v2_eta_dec': 280.5183204912709, 'v2_eta_inc': 28.297554981599696, 'v3_dec': 286.25118089868266, 'v3_inc': 30.42076774734727, 'v3_zeta': 2.4071834979709793e-14, 'v3_eta': 85.83440673704222, 'v3_zeta_dec': 40.186767906504166, 'v3_zeta_inc': 34.642182695768, 'v3_eta_dec': 166.24042846510952, 'v3_eta_inc': 40.4243181226488}
- pmagpy.pmag.scalc_vgp_df(vgp_df, anti=0, rev=0, cutoff=180.0, kappa=0, n=0, spin=0, v=0, boot=False, mm97=False, nb=1000, verbose=True)[source]#
Calculates Sf for a dataframe with VGP latitudes; optional Fisher's k, site latitude, and N information can be used to correct for within-site scatter (McElhinny & McFadden, 1997).
- Parameters:
vgp_df (Pandas DataFrame) – REQUIRED column: vgp_lat (VGP latitude). REQUIRED ONLY for the MM97 correction: dir_k (Fisher kappa estimate), dir_n_samples (number of samples per site), lat (latitude of the site), with mm97=True to apply the within-site scatter correction. OPTIONAL: boot (if True, do a bootstrap), nb (number of bootstraps, default is 1000)
anti (Boolean) – if True, take antipodes of reverse poles
spin (Boolean) – if True, transform data to spin axis
rev (Boolean) – if True, take only reverse poles
v (Boolean) – if True, filter data with Vandamme (1994) cutoff
boot (Boolean) – if True, use bootstrap for confidence 95% interval
mm97 (Boolean) – if True, use McFadden McElhinny 1997 correction for S
nb (int) – number of bootstrapped pseudosamples for confidence estimate
verbose (Boolean) – if True, print messages
- Returns:
N (number of VGPs used in calculation)
S_B (S value)
low (95% confidence lower bound [0 if boot=0])
high (95% confidence upper bound [0 if boot=0])
cutoff (cutoff used in calculation of S)
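A minimal usage sketch, assuming a pandas DataFrame that holds at least a vgp_lat column (a vgp_lon column is included here on the assumption that full pole positions may be needed; all values are invented). Add dir_k, dir_n_samples, and lat columns and set mm97=True to apply the MM97 correction.
import pandas as pd
import pmagpy.pmag as pmag

vgp_df = pd.DataFrame({'vgp_lat': [75., 82., 68., 88., 79.],
                       'vgp_lon': [110., 35., 290., 200., 15.]})
N, S_B, low, high, cutoff = pmag.scalc_vgp_df(vgp_df, verbose=False)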
- pmagpy.pmag.scoreit(pars, PmagSpecRec, accept, text, verbose)[source]#
Deprecated 9/14/2022
Gets a grade for a given set of data and outputs the results.
- pmagpy.pmag.separate_directions(di_block)[source]#
Separates set of directions into two modes based on principal direction
- Parameters:
di_block (block of nested dec,inc pairs)
- Returns:
mode_1_block,mode_2_block
- Return type:
two arrays of nested dec,inc pairs
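A usage sketch with an invented set of dual-polarity directions:
import pmagpy.pmag as pmag

di_block = [[350., 55.], [10., 60.], [5., 48.], [175., -52.], [182., -61.]]
mode_1, mode_2 = pmag.separate_directions(di_block)  # two arrays of [dec, inc] pairs, one per mode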
- pmagpy.pmag.set_priorities(SO_methods, ask)[source]#
Figure out which sample_azimuth to use, if multiple orientation methods
- pmagpy.pmag.sort_diclist(undecorated, sort_on)[source]#
Sort a list of dictionaries by the value in each dictionary for the sorting key.
- Parameters:
undecorated (list of dicts)
sort_on (str, numeric) – key that is present in all dicts to sort on
- Return type:
Ordered list of dicts
Examples
>>> lst = [{'key1': 10, 'key2': 2}, {'key1': 1, 'key2': 20}]
>>> sort_diclist(lst, 'key1')
[{'key2': 20, 'key1': 1}, {'key2': 2, 'key1': 10}]
>>> sort_diclist(lst, 'key2')
- pmagpy.pmag.sort_magic_data(magic_data, sort_name)[source]#
Sort magic_data by header.
- Parameters:
magic_data (table from a MagIC upload (or downloaded) txt file)
sort_name (str) – name of header to sort by, (‘er_specimen_name’ for example)
- Returns:
magic_data
- Return type:
sorted table by indicated sort_name
- pmagpy.pmag.sortarai(datablock, s, Zdiff, **kwargs)[source]#
Sorts data block into first_Z, first_I, etc.
- Parameters:
datablock (Pandas DataFrame with Thellier-Thellier type data)
s (specimen name)
Zdiff (if True, take difference in Z values instead of vector difference) – NB: this should always be False
**kwargs – version : data model. if not 3, assume data model = 2.5
- Returns:
araiblock ([first_Z, first_I, ptrm_check, ptrm_tail, zptrm_check, GammaChecks])
field (lab field (in tesla))
- pmagpy.pmag.sortmwarai(datablock, exp_type)[source]#
Sorts microwave double heating data block into first_Z, first_I, etc.
- pmagpy.pmag.sortshaw(s, datablock)[source]#
Sorts a Shaw-type data block into NRM, TRM, ARM1, and ARM2 lists, putting the first zero-field steps into first_Z.
- pmagpy.pmag.squish(incs, f)[source]#
Returns 'flattened' inclination, assuming flattening factor f and the King (1955) formula: tan(I_o) = f tan(I_f).
- Parameters:
incs (array of inclination (I_f) data to flatten)
f (flattening factor)
- Returns:
I_o
- Return type:
inclinations after flattening
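A usage sketch; since squish and unsquish apply the same King (1955) relation in opposite directions, applying one after the other with the same f should recover the original inclinations.
import numpy as np
import pmagpy.pmag as pmag

incs = np.array([63.4, 59.2, 73.9, 85., -49.1, 70.7])
flattened = pmag.squish(incs, 0.5)        # apply a flattening factor f = 0.5
restored = pmag.unsquish(flattened, 0.5)  # undo the flattening with the same f
print(np.allclose(restored, incs))        # expected: True (up to floating-point precision)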
- pmagpy.pmag.tauV(T)[source]#
Gets the eigenvalues (tau) and eigenvectors (V) from 3x3 matrix T.
- Parameters:
T (3x3 matrix)
- Returns:
t (eigenvalues for the given matrix (T))
V (eigenvectors for the given matrix (T))
Examples
>>> T = [[2,4,6], [10,2,5], [1,7,8]]
>>> pmag.tauV(T)
([(1.2709559412652764+0j), (-0.13547797063263817+0.11627030078868397j), (-0.13547797063263817-0.11627030078868397j)], [array([0.473150982577391+0.j, 0.600336609447566+0.j, 0.644766704353637+0.j]), array([-0.006695867252108+0.161305398937403j, 0.801217123829199+0.j , -0.567608562961462-0.09903218351161j ]), array([-0.006695867252108-0.161305398937403j, 0.801217123829199-0.j , -0.567608562961462+0.09903218351161j ])])
- pmagpy.pmag.tcalc(nf, p)[source]#
Returns the t-table value for nf degrees of freedom at significance level p (0.05 or 0.01).
- Parameters:
nf (degrees of freedom)
p (either 0.05 or 0.01)
- Return type:
t value or 0 if given an invalid p value
Examples
>>> pmag.tcalc(8,0.05) 2.3646
>>> pmag.tcalc(8,0.07) 0
>>> pmag.tcalc(8,0.01) 3.4995
- pmagpy.pmag.unpack(gh)[source]#
Unpacks gh list into l m g h type list.
- Parameters:
gh (list of gauss coefficients (as returned by, e.g., doigrf))
- Returns:
data
- Return type:
nested list of [[l,m,g,h],…]
Examples
>>> gh = pmag.doigrf(30,70,10,2022,coeffs=True)
>>> pmag.unpack(gh)
[[1, 0, -29404.8, 0], [1, 1, -1450.9, 4652.5], [2, 0, -2499.6, 0], [2, 1, 2982.0, -2991.6], [2, 2, 1677.0, -734.6], [3, 0, 1363.2, 0], [3, 1, -2381.2, -82.1], [3, 2, 1236.2, 241.9], [3, 3, 525.7, -543.4], ...
- pmagpy.pmag.unsquish(incs, f)[source]#
Returns ‘unflattened’ inclination, assuming factor, f and King (1955) formula: tan (I_o) = tan (I_f)/f.
- Parameters:
incs (array of inclination (I_f) data to unflatten)
f (flattening factor)
- Returns:
I_o (array of inclinations after unflattening)
Examples
>>> incs = [63.4,59.2,73.9,85,-49.1,70.7]
>>> np.round(pmag.unsquish(incs,.5),1)
array([ 75.9, 73.4, 81.8, 87.5, -66.6, 80.1])
>>> incs=np.loadtxt('data_files/unsquish/unsquish_example.dat')
>>> pmag.unsquish(incs,.5)
array([[-19.791612533135584, 38.94002937796913 ], [ 3.596453939529656, 35.75555908297152 ], [ 11.677464698445519, 27.012196299111633], [ 0.399995126240053, 46.27631997468994 ], [ 46.760422847350405, 39.080596252430965], [ 48.64708345693855 , 37.07969161240791 ],
…
- pmagpy.pmag.upload_read(infile, table)[source]#
Deprecated 9/14/2022
Reads a table from a MagIC upload (or downloaded) txt file and puts data in a list of dictionaries.
- pmagpy.pmag.vdm_b(vdm, lat)[source]#
Converts a virtual dipole moment (VDM) or a virtual axial dipole moment (VADM) to a local magnetic field value
- Parameters:
vdm (V(A)DM in units of Am^2)
lat (latitude of site in degrees)
- Returns:
B
- Return type:
local magnetic field strength in tesla
Examples
>>> pmag.vdm_b(65, 20)
2.9215108300460446e-26
- pmagpy.pmag.vector_mean(data)[source]#
Calculates the vector mean of a given set of vectors.
- Parameters:
data (nested array of [dec,inc,intensity])
- Returns:
dir (array of [dec, inc, 1])
R (resultant vector length)
Examples
>>> data=np.loadtxt('data_files/vector_mean/vector_mean_example.dat')
>>> Dir,R=pmag.vector_mean(data)
>>> data.shape[0],Dir[0],Dir[1],R
(100, 1.2702459152657795, 49.62123008281823, 2289431.9813831896)
>>> data = np.array([[16.0, 43.0, 21620.33], [30.5, 53.6, 12922.58], [6.9, 33.2, 15780.08], [352.5, 40.2, 33947.52], [354.2, 45.1, 19725.45]])
>>> pmag.vector_mean(data)
(array([ 3.875568482416763, 43.02570375878505 , 1.]), 102107.93048882612)
- pmagpy.pmag.vfunc(pars_1, pars_2)[source]#
Calculate the Watson Vw test statistic. Calculated as 2*(Sw-Rw)
- Parameters:
pars_1 (dictionary of Fisher statistics from population 1)
pars_2 (dictionary of Fisher statistics from population 2)
- Returns:
Vw
- Return type:
Watson’s Vw statistic
- pmagpy.pmag.vgp_di(plat, plong, slat, slong)[source]#
Converts a pole position (pole latitude, pole longitude) to a direction (declination, inclination) at a given location (slat, slong) assuming a dipolar field.
- Parameters:
plat (latitude of pole (vgp latitude))
plong (longitude of pole (vgp longitude))
slat (latitude of site)
slong (longitude of site)
- Returns:
dec,inc
- Return type:
tuple of declination and inclination
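A usage sketch: for a pole at the geographic north pole, the direction at a site should follow the dipole formula (declination near 0, inclination given by tan I = 2 tan(site latitude)); the site coordinates are illustrative.
import pmagpy.pmag as pmag

dec, inc = pmag.vgp_di(90., 0., 45., 0.)  # pole latitude/longitude, site latitude/longitude
# expected for a geocentric axial dipole: dec close to 0, inc close to 63.4 (compare pmag.pinc(45))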
- pmagpy.pmag.vocab_convert(vocab, standard, key='')[source]#
Converts MagIC database terms (method codes, geologic_types, etc.) to other standards. May not be comprehensive for each standard; terms are added to standards as people need them and may not be up to date.
‘key’ can be used to distinguish vocab terms that exist in two different lists.
- Return type:
value of the MagIC vocab in the standard requested
Examples
>>> pmag.vocab_convert('Egypt','GEOMAGIA')
'1'
- pmagpy.pmag.vspec(data)[source]#
Deprecated 9/14/2022
Takes the vector mean of replicate measurements at a given step.
- pmagpy.pmag.watsonsV(Dir1, Dir2)[source]#
Calculates Watson’s V statistic for two sets of directions
- pmagpy.pmag.watsons_f(DI1, DI2)[source]#
Calculates Watson’s F statistic (equation 11.16 in Essentials text book).
- Parameters:
DI1 (nested array of [Dec,Inc] pairs)
DI2 (nested array of [Dec,Inc] pairs)
- Returns:
F (Watson’s F)
Fcrit (critical value from F table)
Examples
>>> D1= [[-45,150],[-40,150],[-38,145]]
>>> D2= [[-43,140],[-39,130],[-38,145]]
>>> pmag.watsons_f(D1,D2)
(3.7453156915587567, 4.459)
- pmagpy.pmag.weighted_mean(data)[source]#
Calculates the weighted mean of data.
- Parameters:
data (array of data)
- Returns:
mean (mean of the data as a float)
stdev (standard deviation of the data as a float)
Examples
>>> data = np.array([ [16.0, 43.0, 33], [30.5, 53.6, 58], [6.9, 33.2, 8], [352.5, 40.2, 52], [354.2, 45.1, 45]])
>>> pmag.weighted_mean(data)
(152.00743840074387, 81.7174866362813)