Jupyter notebook demonstrating the use of additional PmagPy functions

This Jupyter notebook demonstrates a number of PmagPy functions within a notebook environment running a Python 2.7 kernel. The benefits of working within these notebooks include: reproducibility, interactive code development, a convenient workspace for projects, version control (when integrated with GitHub or other version control software), and ease of sharing. The other example PmagPy notebook in this repository (Example_PmagPy_Notebook.ipynb in https://github.com/PmagPy/2016_Tauxe-et-al_PmagPy_Notebooks which can also be seen here: http://pmagpy.github.io/Example_PmagPy_Notebook.html) includes additional instructions on how to get started using PmagPy in the notebook. That example notebook contains a more extended workflow applied to two data sets, while this notebook contains numerous code vignettes illustrating additional functionality available within the PmagPy module.

The notebook was created by Luke Fairchild and Nicholas Swanson-Hysell and accompanies a paper entitled:

PmagPy: Software package for paleomagnetic data analysis and a bridge to the Magnetics Information Consortium (MagIC) Database

L. Tauxe, R. Shaar, L. Jonestrask, N.L. Swanson-Hysell, R. Minnett, A.A.P. Koppers, C.G. Constable, N. Jarboe, K. Gaastra, and L. Fairchild

Contents of the notebook

Basic function examples

Paleomagnetic data analysis examples

Paleomagnetic poles plotting examples

Rock magnetism data analysis and visualization examples

Additional Features of the Jupyter Notebook

Note: This notebook makes use of additional scientific Python modules: pandas for reading, displaying, and using data with a dataframe structure, numpy for array computations, and matplotlib for plotting. These modules come standard with scientific computing distributions of Python or can be installed separately as needed.

Import necessary function libraries for data analysis

The code block below imports the pmagpy module from PmagPy that provides functions that will be used in the data analysis. At present, the most straightforward way to do so is to install the pmagpy module using the pip package manager by executing this command at the command line:

pip install pmagpy

Approaches not using pip can also work. One way would be to download the pmagpy folder from the main PmagPy repository and either put it in the same directory as the notebook or put it anywhere on your local machine and add a statement such as export PYTHONPATH=$PYTHONPATH:~/PmagPy to your .profile or .bash_profile file that points to where PmagPy is on your local machine (in this example, in the home directory).

With PmagPy available in this way, the function modules from PmagPy can be imported: pmag, a module with ~160 (and growing) functions for analyzing paleomagnetic and rock magnetic data, and ipmag, a module with functions that combine and extend pmag functions and expose pmagplotlib functions in order to generate output that works well within the Jupyter notebook environment.

To execute the code, click the play button in the menu bar, choose 'Run' under the 'Cell' menu at the top of the notebook, or type shift+enter.

In [1]:
import pmagpy.ipmag as ipmag
import pmagpy.pmag as pmag

There are three other important Python libraries (which are bundled with the Canopy and Anaconda installations of Python) that come in quite handy and are used within this notebook: numpy for data analysis using arrays, pandas for data manipulation within dataframes and matplotlib for plotting. The call %matplotlib inline results in the plots being shown within this notebook rather than coming up in external windows.

In [2]:
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
%matplotlib inline
%config InlineBackend.figure_formats = {'svg',}

Some of the plots below utilize the Basemap package, which enables the plotting of data on a variety of geographic projections. Basemap is not a standard part of most scientific Python distributions, so you may need to take extra steps to install it. If using the Anaconda distribution, you can type conda install basemap at the command line. The Enthought Canopy distribution has a GUI package manager that you can use for installing the package, although a Canopy subscription (free for academic users) may be necessary for installation. This code cell can be skipped if you don't have Basemap installed, as it is only necessary for a few of the code vignettes.

In [3]:
from mpl_toolkits.basemap import Basemap

The dipole equation

The following demonstrates the use of a simple function (ipmag.lat_from_inc) which uses the dipole equation to return the latitude expected from inclination data under a pure geocentric axial dipole field. The inclination expected for the geomagnetic field at a specified latitude can be calculated with ipmag.inc_from_lat.

In [4]:
inclination = range(0, 90, 1)
latitude = []
for inc in inclination:
    latitude.append(ipmag.lat_from_inc(inc))
In [5]:
plt.plot(latitude, inclination)
plt.xlabel('latitude')
plt.ylabel('inclination')
plt.show()
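The relationship being inverted here is the dipole equation, tan(I) = 2 tan(λ). A stdlib-only sketch of both conversions (the function names mirror the ipmag ones for readability, but these are illustrative reimplementations, not the PmagPy calls):

```python
import math

def lat_from_inc(inc):
    """Latitude (degrees) from inclination via the dipole equation tan(I) = 2*tan(lat)."""
    return math.degrees(math.atan(math.tan(math.radians(inc)) / 2.0))

def inc_from_lat(lat):
    """Expected geocentric axial dipole inclination (degrees) at a given latitude."""
    return math.degrees(math.atan(2.0 * math.tan(math.radians(lat))))

print(inc_from_lat(37.87))   # expected GAD inclination at Berkeley's latitude
print(lat_from_inc(60.0))    # latitude recovered from a 60 degree inclination
```

Note that the two functions are inverses of one another, so round-tripping a value returns the original input.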
Get local geomagnetic field estimate from IGRF

The function ipmag.igrf uses the International Geomagnetic Reference Field (IGRF) model to estimate the geomagnetic field direction at a particular location and time. Let's find the direction of the geomagnetic field in Berkeley, California (37.87° N, 122.27° W, elevation of 52 m) on August 27, 2013 (in decimal format, 2013.6544).

In [6]:
berk_igrf = ipmag.igrf([2013.6544, .052, 37.871667, -122.272778])
Declination: 13.950
Inclination: 61.354
Intensity: 13.950 nT

Plotting directions

We can plot this direction using matplotlib (plt) in conjunction with a few ipmag functions. To do this, we first initiate a figure (numbered as Fig. 0, with a size of 6x6) with the following syntax:

plt.figure(num=0, figsize=(6,6))
We then draw an equal area stereonet within the figure, specifying the figure number:

ipmag.plot_net(0)
Now we can plot the direction we just pulled from IGRF using ipmag.plot_di():

ipmag.plot_di(berk_igrf[0], berk_igrf[1])
To label or color the plotted points, we would pass the same code as above with a few extra arguments and one additional line of code to display the legend:

ipmag.plot_di(berk_igrf[0],berk_igrf[1], color='r', label="Berkeley, CA -- August 27, 2013")
plt.legend()

We may wish to save the figure we just created. To do so, we would pass the following save function, specifying: 1) the relative path to the folder where we want the figure to be saved and 2) the name of the file with the desired extension (.pdf in this example):

plt.savefig('relative/path/figure_name.pdf')
To ensure the figure is displayed properly and then cleared from the namespace, it is good practice to end such a code block with the following:

plt.show()
Now let's run the code we just developed.

In [7]:
plt.figure(num=0, figsize=(6,6))
ipmag.plot_net(0)
ipmag.plot_di(berk_igrf[0],berk_igrf[1], color='r', label="Berkeley, CA -- August 27, 2013")
plt.legend()
plt.show()

Let's see how this magnetic direction compares to the Geocentric Axial Dipole (GAD) model of the geomagnetic field. We can estimate the expected GAD inclination by passing Berkeley's latitude to the function ipmag.inc_from_lat.

We also demonstrate below how to manipulate the placement of the figure legend to ensure no data points are obscured. plt.legend uses the "best" location by default, but this can be changed with the following:

plt.legend(loc="upper right")


plt.legend(loc="lower center")

See the plt.legend documentation for the complete list of placement options. Alternatively, you can give (x,y) coordinates to the loc= keyword argument (with the origin (0,0) at the lower left of the figure). To manipulate placement more precisely, use the keyword bbox_to_anchor in conjunction with loc. If this is done, loc becomes the anchor point on the legend, and bbox_to_anchor places this anchor point at the specified coordinates. The latter method is demonstrated below. Play around with the plt.legend arguments to see how this changes things.

In [8]:
GAD_inc = ipmag.inc_from_lat(37.87)
plt.figure(num=0, figsize=(6,6))
ipmag.plot_net(0)
ipmag.plot_di(berk_igrf[0],berk_igrf[1], color='r', label="Berkeley, CA -- August 27, 2013 (IGRF)")
ipmag.plot_di(0,GAD_inc, color='b', label="Berkeley, CA (GAD)")
plt.legend(loc='center left', bbox_to_anchor=(1.0, 0.5))
plt.show()

Below, we calculate the angular difference between these two directions.

Calculate the angle between directions

While ipmag functions have been optimized to perform tasks within an interactive computing environment such as the Jupyter notebook, the pmag functions used extensively within ipmag can also be called directly. Here is a demonstration of the function pmag.angle, which calculates the angle between two directions and outputs a numpy array. Continuing the comparison from the last section, let's calculate the angle between the IGRF and GAD magnetic directions calculated and plotted above.

In [9]:
direction1 = [berk_igrf[0],berk_igrf[1]]
direction2 = [0,GAD_inc]
print pmag.angle(direction1,direction2)[0]
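Under the hood, the angle between two directions is the arccosine of the dot product of their unit vectors. A stdlib-only sketch of that calculation (to_cartesian and angle_between are illustrative names, not PmagPy functions):

```python
import math

def to_cartesian(dec, inc):
    """Unit vector (x, y, z) from a declination and inclination in degrees."""
    d, i = math.radians(dec), math.radians(inc)
    return (math.cos(d) * math.cos(i), math.sin(d) * math.cos(i), math.sin(i))

def angle_between(direction1, direction2):
    """Angle in degrees between two (dec, inc) directions."""
    x1 = to_cartesian(*direction1)
    x2 = to_cartesian(*direction2)
    dot = sum(a * b for a, b in zip(x1, x2))
    # clamp to [-1, 1] to guard against floating point overshoot
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

print(angle_between((0, 0), (90, 0)))  # orthogonal horizontal directions: 90.0
```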

Generate and plot Fisher distributed unit vectors from a specified distribution

Let's use the function ipmag.fishrot to generate a set of 50 Fisher-distributed directions about a declination of 200° and inclination of 50°. These directions will serve as an example paleomagnetic dataset for the next several examples. The output of ipmag.fishrot is a nested list of [declination, inclination, intensity] vectors; generally these are unit vectors with an intensity of 1.0. We refer to this data structure as a di_block. In the code below, the first two vectors are shown.

In [10]:
fisher_directions = ipmag.fishrot(k=40, n=50, dec=200, inc=50)
fisher_directions[:2]
[[192.83513262821486, 48.345178352843504, 1.0],
 [196.52060188859477, 45.216123128952503, 1.0]]
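Drawing a single Fisher-distributed direction can be sketched with a standard inversion method (similar in spirit to what fishrot uses internally via pmag.fshdev): draw a colatitude from the Fisher distribution about the vertical and a uniform declination. Rotating the deviates to an arbitrary mean direction, which fishrot adds on top, is omitted here, and fisher_deviate is an illustrative name:

```python
import math
import random

def fisher_deviate(k):
    """One (dec, inc) drawn from a Fisher distribution with precision
    parameter k, centered on the vertical (inc = 90)."""
    r1, r2 = random.random(), random.random()
    L = math.exp(-2.0 * k)
    # invert the Fisher cumulative distribution for the colatitude
    colat = 2.0 * math.asin(math.sqrt(-math.log(r1 * (1.0 - L) + L) / (2.0 * k)))
    dec = 360.0 * r2
    inc = 90.0 - math.degrees(colat)
    return dec, inc

random.seed(0)
sample = [fisher_deviate(40) for _ in range(50)]
print(min(inc for _, inc in sample))  # inclinations cluster tightly near 90 for k = 40
```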

This di_block can be unpacked into separate lists of declination and inclination using the ipmag.unpack_di_block function.

In [11]:
fisher_decs, fisher_incs = ipmag.unpack_di_block(fisher_directions)
print fisher_decs[0]
print fisher_incs[0]
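Since a di_block is just a nested list, the same unpacking can also be done with Python's built-in zip (a stdlib-only equivalent shown for illustration, not the PmagPy call):

```python
# a small di_block: rows of [declination, inclination, intensity]
di_block = [[192.8, 48.3, 1.0],
            [196.5, 45.2, 1.0]]

# zip(*rows) transposes the nested list, yielding one tuple per column
decs, incs, intensities = zip(*di_block)
print(decs)  # (192.8, 196.5)
```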

Another way to deal with the di_block is to make it into a pandas dataframe, which allows the directions to be nicely displayed and analyzed. In the code below, a dataframe is made from the fisher_directions di_block and the first 5 rows are displayed with .head().

In [12]:
directions = pd.DataFrame(fisher_directions,columns=['dec','inc','length'])
directions.head()
dec inc length
0 192.835133 48.345178 1
1 196.520602 45.216123 1
2 197.175873 52.220636 1
3 197.746394 49.758821 1
4 193.273778 44.069669 1

Now let's calculate the Fisher and Bingham means of these data.

In [13]:
fisher_mean = ipmag.fisher_mean(directions.dec,directions.inc)
bingham_mean = ipmag.bingham_mean(directions.dec,directions.inc)

Here's the raw output of the Fisher mean which is a dictionary containing the mean direction and associated statistics:

In [14]:
fisher_mean
{'alpha95': 3.3508355923555615,
 'csd': 13.295057898998207,
 'dec': 198.2819634243472,
 'inc': 51.51476485197729,
 'k': 37.118427710472233,
 'n': 50,
 'r': 48.679900981199815}
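For reference, these statistics follow from the length R of the resultant of the N unit vectors: the precision parameter is estimated as k = (N-1)/(N-R), and the 95% confidence radius from cos(a95) = 1 - ((N-R)/R)((1/0.05)^(1/(N-1)) - 1). A stdlib-only sketch of that arithmetic (fisher_stats is an illustrative name, not the PmagPy call):

```python
import math

def fisher_stats(decs, incs):
    """Fisher mean direction and statistics from lists of dec/inc in degrees."""
    xs = [(math.cos(math.radians(d)) * math.cos(math.radians(i)),
           math.sin(math.radians(d)) * math.cos(math.radians(i)),
           math.sin(math.radians(i))) for d, i in zip(decs, incs)]
    n = len(xs)
    sx, sy, sz = (sum(c) for c in zip(*xs))
    r = math.sqrt(sx**2 + sy**2 + sz**2)           # resultant vector length
    dec = math.degrees(math.atan2(sy, sx)) % 360.0
    inc = math.degrees(math.asin(sz / r))
    k = (n - 1.0) / (n - r)                        # precision parameter estimate
    a95 = math.degrees(math.acos(1.0 - (n - r) / r * (20.0**(1.0 / (n - 1)) - 1.0)))
    csd = 81.0 / math.sqrt(k)                      # circular standard deviation
    return {'dec': dec, 'inc': inc, 'n': n, 'r': r, 'k': k,
            'alpha95': a95, 'csd': csd}

print(fisher_stats([198.0, 199.0, 197.5], [51.0, 52.0, 51.5]))
```

Tightly clustered directions give a large k and a small alpha95, as in the dictionary above.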

The function ipmag.print_direction_mean prints formatted output from this Fisher mean dictionary:

In [15]:
ipmag.print_direction_mean(fisher_mean)
Dec: 198.3  Inc: 51.5
Number of directions in mean (n): 50
Angular radius of 95% confidence (a_95): 3.4
Precision parameter (k) estimate: 37.1

Now we can plot all of our data using the function ipmag.plot_di. We can also plot the Fisher mean with its angular radius of 95% confidence ( $\alpha_{95}$ ) using ipmag.plot_di_mean.

In [16]:
declinations = directions.dec.tolist()
inclinations = directions.inc.tolist()

plt.figure(num=0, figsize=(6,6))
ipmag.plot_net(0)
ipmag.plot_di(declinations, inclinations)
ipmag.plot_di_mean(fisher_mean['dec'], fisher_mean['inc'], fisher_mean['alpha95'], color='r')
plt.show()


Flip polarity of directional data

Let's flip all the directions (find their antipodes) of the Fisher-distributed population using the ipmag.do_flip() function and plot the resulting directions.

In [17]:
# get reversed directions
dec_reversed,inc_reversed = ipmag.do_flip(declinations,inclinations)

# take the Fisher mean of these reversed directions
rev_mean = ipmag.fisher_mean(dec_reversed,inc_reversed)

# plot the flipped directions
ipmag.plot_di(dec_reversed, inc_reversed)
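Flipping a direction to its antipode is just adding 180° to the declination (mod 360) and reversing the sign of the inclination. A one-function stdlib sketch (flip_direction is an illustrative name, not the PmagPy call):

```python
def flip_direction(dec, inc):
    """Antipode of a (dec, inc) direction, both in degrees."""
    return (dec + 180.0) % 360.0, -inc

print(flip_direction(200.0, 45.0))  # (20.0, -45.0)
```

Applying the flip twice returns the original direction.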

Test directional data for Fisher distribution

The function ipmag.fishqq tests whether directional data are Fisher-distributed. Let's use this test on the random Fisher-distributed directions we just created (it should pass!).

In [18]:
ipmag.fishqq(declinations, inclinations)
{'Dec': 198.25239777790117,
 'Inc': 51.470606976470293,
 'Me': 0.72251289446940592,
 'Me_critical': 1.094,
 'Mode': 'Mode 1',
 'Mu': 1.1600945148437338,
 'Mu_critical': 1.207,
 'N': 50,
 'Test_result': 'consistent with Fisherian model'}

Flattening and unflattening directional data

Inclination flattening can occur for magnetizations in sedimentary rocks. We can simulate inclination error with a specified flattening factor using the function ipmag.squish. Flattening factors range from 0 (completely flattened) to 1 (no flattening). Let's squish our directions with a flattening factor of 0.4.

In [19]:
# squish all inclinations
squished_incs = []
for inclination in inclinations:
    squished_incs.append(ipmag.squish(inclination, 0.4))

# plot the squished directional data
squished_DIs = np.array(zip(declinations,squished_incs))
ipmag.plot_net(1)
ipmag.plot_di(squished_DIs[:,0], squished_DIs[:,1])
In [20]:
ipmag.fisher_mean(declinations, squished_incs)
{'alpha95': 4.205181611462165,
 'csd': 16.558322028145675,
 'dec': 198.98671467582687,
 'inc': 27.605625253249691,
 'k': 23.929707418862442,
 'n': 50,
 'r': 47.952336017222841}

We can also "unsquish" data by a specified flattening factor. Let's unsquish the data we squished above with the function ipmag.unsquish. Using a flattening factor of 0.4 will restore the data to its original state.

In [21]:
unsquished_incs = []
for squished_inc in squished_incs:
    unsquished_incs.append(ipmag.unsquish(squished_inc, 0.4))

# plot the unsquished directional data
ipmag.plot_net(1)
ipmag.plot_di(declinations, unsquished_incs)
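The flattening model behind both functions is tan(I_observed) = f · tan(I_original): squishing multiplies the tangent of the inclination by f, and unsquishing divides by it. A stdlib sketch of the round trip (illustrative reimplementations, not the PmagPy calls):

```python
import math

def squish(inc, f):
    """Flatten an inclination: tan(I_obs) = f * tan(I_orig)."""
    return math.degrees(math.atan(f * math.tan(math.radians(inc))))

def unsquish(inc, f):
    """Invert the flattening: tan(I_orig) = tan(I_obs) / f."""
    return math.degrees(math.atan(math.tan(math.radians(inc)) / f))

original = 50.0
flattened = squish(original, 0.4)
restored = unsquish(flattened, 0.4)
print(flattened)  # about 25.5 degrees for a 50 degree inclination
print(restored)   # the round trip recovers the original inclination
```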

Estimating inclination flattening using the E/I method

When dealing with a set of flattened data, it is necessary to develop an estimate of how much the data have been flattened. One approach that can be taken using PmagPy is the elongation-inclination (E/I) method of Tauxe and Kent (2004). The code block below simulates directions from the TK03 secular variation model using the ipmag.tk03 function.

In [28]:
directions_tk03 = ipmag.tk03(n=200,lat=45)
dec_tk03, inc_tk03 = ipmag.unpack_di_block(directions_tk03)
directions_tk03_mean = ipmag.fisher_mean(dec_tk03, inc_tk03)

ipmag.plot_net(1)
ipmag.plot_di(dec_tk03, inc_tk03)
ipmag.print_direction_mean(directions_tk03_mean)
Dec: 1.0  Inc: 61.2
Number of directions in mean (n): 200
Angular radius of 95% confidence (a_95): 2.3
Precision parameter (k) estimate: 19.6

The code block below flattens this simulated data set with a flattening factor of 0.3.

In [29]:
inc_tk03_squished = []
for n in range(len(inc_tk03)):
    inc_tk03_squished.append(ipmag.squish(inc_tk03[n], 0.3))
ipmag.plot_di(dec_tk03, inc_tk03_squished)

These flattened data are then passed to the ipmag.find_ei function which follows the same workflow as the command line find_ei.py program. The function tries different flattening factors and "unsquishes" inclinations to find the flattening factor that gives an elongation/inclination pair consistent with the TK03 secular variation model. This function should return a flattening factor of 0.3 and an inclination that matches the unflattened inclination of the initially simulated data.

In [30]:
directions_tk03_squished = ipmag.make_di_block(dec_tk03,inc_tk03_squished)
ipmag.find_ei(np.array(directions_tk03_squished))

Bootstrapping.... be patient

The original inclination was: 31.113744196

The corrected inclination is: 59.7354648602
with bootstrapped confidence bounds of: 49.8050514758 to 67.5346713064
and elongation parameter of: 1.45962963799

Bootstrap reversal test

This code uses the ipmag.fishrot function to simulate normal and reversed directions drawn from antipodal Fisher distributions. It then conducts a bootstrap reversal test (Tauxe, 2010; ipmag.reversal_test_bootstrap) to test whether the two populations are antipodal to one another (which they should be!)

In [31]:
normal_directions = ipmag.fishrot(k=20,n=40,dec=30,inc=45)
reversed_directions = ipmag.fishrot(k=20,n=30,dec=210,inc=-45)
combined_directions = normal_directions + reversed_directions
ipmag.reversal_test_bootstrap(di_block=combined_directions)

Here are the results of the bootstrap reversal test, in which the reversed mode is flipped to its antipodes and the two modes are examined for a common mean:
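The resampling idea behind the test can be sketched as follows: flip the reversed mode, draw bootstrap pseudo-samples from each mode, and check whether the 95% bounds on the Cartesian components of the resampled mean directions overlap. This is a simplified stand-in for the PmagPy implementation (which compares the bootstrapped components graphically), with illustrative function names:

```python
import math
import random

def to_cartesian(dec, inc):
    d, i = math.radians(dec), math.radians(inc)
    return (math.cos(d) * math.cos(i), math.sin(d) * math.cos(i), math.sin(i))

def bootstrap_component_bounds(directions, nboot=500):
    """95% bounds on each Cartesian component of bootstrapped mean unit vectors."""
    vectors = [to_cartesian(d, i) for d, i in directions]
    means = []
    for _ in range(nboot):
        resample = [random.choice(vectors) for _ in vectors]
        sx, sy, sz = (sum(c) for c in zip(*resample))
        r = math.sqrt(sx**2 + sy**2 + sz**2)
        means.append((sx / r, sy / r, sz / r))
    bounds = []
    for comp in zip(*means):
        ordered = sorted(comp)
        bounds.append((ordered[int(0.025 * nboot)], ordered[int(0.975 * nboot)]))
    return bounds

def intervals_overlap(b1, b2):
    """True when the bootstrap bounds overlap for all three components."""
    return all(lo1 <= hi2 and lo2 <= hi1
               for (lo1, hi1), (lo2, hi2) in zip(b1, b2))

random.seed(42)
normal_mode = [(30 + random.gauss(0, 5), 45 + random.gauss(0, 5)) for _ in range(40)]
reversed_mode = [(210 + random.gauss(0, 5), -45 + random.gauss(0, 5)) for _ in range(30)]
# flip the reversed mode to its antipodes before comparing
flipped_mode = [((d + 180.0) % 360.0, -i) for d, i in reversed_mode]
overlap = intervals_overlap(bootstrap_component_bounds(normal_mode),
                            bootstrap_component_bounds(flipped_mode))
print(overlap)  # antipodal modes are expected to overlap
```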

McFadden and McElhinny (1990) reversal test

Another reversal test that can be applied to these data is the McFadden and McElhinny (1990) reversal test (ipmag.reversal_test_MM1990). This test is an adaptation of the Watson V test for a common mean.

In [32]:
ipmag.reversal_test_MM1990(di_block1=normal_directions, di_block2=reversed_directions,
                           plot_CDF=True, plot_stereo=True,
                           save=True, save_folder='./Additional_Notebook_Output/')
Results of Watson V test: 

Watson's V:           4.0
Critical value of V:  6.2
"Pass": Since V is less than Vcrit, the null hypothesis
that the two populations are drawn from distributions
that share a common mean direction can not be rejected.

M&M1990 classification:

Angle between data set means: 6.2
Critical angle for M&M1990:   7.6
The McFadden and McElhinny (1990) classification for
this test is: 'B'

Working with poles

A variety of plotting functions within PmagPy, together with the Basemap toolkit for matplotlib, provide a great way to work with paleomagnetic poles, virtual geomagnetic poles, and apparent polar wander paths.

In order to get a sense of what the most basic structure of this code block should look like, let's start by manually inputting the data for two random poles.

In [33]:
# initiate figure and specify figure size
plt.figure(figsize=(5, 5))

# initiate a Basemap projection, specifying the latitude and
# longitude (lat_0 and lon_0) at which our figure is centered.
pmap = Basemap(projection='ortho',lat_0=30,lon_0=320,
               resolution='c',area_thresh=50000)

# other optional modifications to the globe figure
pmap.drawcoastlines(linewidth=0.25)
pmap.fillcontinents(color='bisque',zorder=1)
pmap.drawmeridians(np.arange(0,360,30))
pmap.drawparallels(np.arange(-90,90,30))

# Here we plot a pole at 340 E longitude, 30 N latitude with an
# alpha 95 error angle of 5 degrees. Keyword arguments allow us
# to specify the label, shape, and color of this data.
ipmag.plot_pole(pmap,340,30,5,label='VGP examples',
                marker='s',color='r')

# We can plot multiple poles sequentially on the same globe using
# the same plot_pole function (the second pole here uses example
# coordinates of 290 E, 60 N with an alpha 95 of 3 degrees).
ipmag.plot_pole(pmap,290,60,3,marker='o',color='b')

# Optional save (uncomment to save the figure)
# plt.savefig('./Additional_Notebook_Output/pole_plot.pdf')
plt.show()