<?xml version="1.0" encoding="utf-8" standalone="yes" ?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Micah J Waldstein</title>
    <link>/</link>
    <description>Recent content on Micah J Waldstein</description>
    <generator>Hugo -- gohugo.io</generator>
    <language>en-us</language>
    <copyright>&amp;copy; 2010 - 2018</copyright>
    <lastBuildDate>Wed, 20 Apr 2016 00:00:00 -0400</lastBuildDate>
    <atom:link href="/" rel="self" type="application/rss+xml" />
    
    <item>
      <title>The DartCannon Technology Stack</title>
      <link>/2018/08/the-dartcannon-technology-stack/</link>
      <pubDate>Thu, 09 Aug 2018 00:00:00 +0000</pubDate>
      
      <guid>/2018/08/the-dartcannon-technology-stack/</guid>
      <description>

&lt;p&gt;Today we want to pull back the curtain on how we built DartCannon and share our
appreciation for all the projects that have helped us get here.&lt;/p&gt;

&lt;h2 id=&#34;the-core&#34;&gt;The Core&lt;/h2&gt;

&lt;p&gt;The foundational piece of DartCannon is our custom-built proprietary Monte Carlo
simulation engine, &lt;strong&gt;Thompson&lt;/strong&gt;. Under development in some form for over 8 years,
Thompson is what enables us to iterate quickly and provide such high-quality
simulations at an affordable price point. Over the development of DartCannon
every other piece of technology has been swapped out for alternatives and
likely will be again, but Thompson has been the constant we&amp;rsquo;ve built around.&lt;/p&gt;

&lt;h2 id=&#34;power-players&#34;&gt;Power Players&lt;/h2&gt;

&lt;p&gt;With Thompson powering the core of DartCannon, these are the components which
drive the main user experience.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://firebase.google.com&#34; target=&#34;_blank&#34;&gt;Firebase&lt;/a&gt; - Our database and authentication layer; Firebase lets
 us avoid worrying about servers or infrastructure.&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://stripe.com&#34; target=&#34;_blank&#34;&gt;Stripe&lt;/a&gt; - For payments and subscriptions; Stripe means we can be
 confident that security is taken care of.&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://vuejs.org&#34; target=&#34;_blank&#34;&gt;Vue&lt;/a&gt; - There are a lot of JavaScript frameworks; Vue happens to be
 the one our team knew and was comfortable using to build DartCannon.&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://vuetify.com&#34; target=&#34;_blank&#34;&gt;Vuetify&lt;/a&gt; - Vuetify is a fantastic material design framework for
 Vue.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&#34;supporting-roles&#34;&gt;Supporting Roles&lt;/h2&gt;

&lt;p&gt;While not as central as our Power Players, these pieces all play important
parts driving very specific features.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;http://www.chartjs.org/&#34; target=&#34;_blank&#34;&gt;Chart.js&lt;/a&gt; - Powers all the pretty confidence intervals.&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/dagrejs/dagre&#34; target=&#34;_blank&#34;&gt;Dagre&lt;/a&gt; - Enables process diagrams for projects.&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://github.com/guyonroche/exceljs&#34; target=&#34;_blank&#34;&gt;exceljs&lt;/a&gt; - The underlying tech for our Excel import / export.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&#34;extras&#34;&gt;Extras&lt;/h2&gt;

&lt;p&gt;In addition to these components, we have a huge pile of additional pieces we
depend on for testing, error handling, screen shots, and more. They are too
numerous and detailed for this overview, but we owe them a lot.&lt;/p&gt;

&lt;h2 id=&#34;thanks&#34;&gt;Thanks&lt;/h2&gt;

&lt;p&gt;DartCannon would not exist without the rich open source ecosystem and we are
deeply indebted to all of the projects mentioned here.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Photo by Chris Yang &lt;a href=&#34;https://unsplash.com/photos/6yDPni7ueSk&#34; target=&#34;_blank&#34;&gt;Unsplash&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>R Directional 3.3 Package is Out</title>
      <link>/2018/08/r-directional-3.3-package-is-out/</link>
      <pubDate>Wed, 08 Aug 2018 00:00:00 +0000</pubDate>
      
      <guid>/2018/08/r-directional-3.3-package-is-out/</guid>
      <description>&lt;p&gt;A few weeks late, but worth calling out that
&lt;a href=&#34;https://cran.r-project.org/web/packages/Directional/&#34; target=&#34;_blank&#34;&gt;Directional 3.3 is out on CRAN&lt;/a&gt;. I had a small contribution which
came out of my investigations into &lt;a href=&#34;/blog/2018/06/introduction-to-spherical-densities-in-r/&#34;&gt;spherical&lt;/a&gt; &lt;a href=&#34;/external/rsphericaldensity/posts/heatmap/&#34;&gt;densities&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I modified the respective posts to use the native Directional &lt;code&gt;vmf.kerncontour&lt;/code&gt;
rather than the previously unreleased modification I made to return the data.&lt;/p&gt;

&lt;p&gt;Thanks to maintainer Michail Tsagris for accepting the patch and
all his help.&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>DartCannon: Week In Review 7/13/2018</title>
      <link>/2018/07/dartcannon-week-in-review-7/13/2018/</link>
      <pubDate>Fri, 13 Jul 2018 00:00:00 +0000</pubDate>
      
      <guid>/2018/07/dartcannon-week-in-review-7/13/2018/</guid>
      <description>

&lt;p&gt;&lt;em&gt;Every week on &lt;a href=&#34;https://dartcannon.com/blog/2018-accepting-an-estimate&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt; we give a rundown on the week, including what we&amp;rsquo;ve
written, what we&amp;rsquo;re reading, and major improvements. I won&amp;rsquo;t cross-post all of them,
but will put them up every so often.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Happy Friday! Here&amp;rsquo;s a rundown of this week in &lt;a href=&#34;https://dartcannon.com&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt;.&lt;/p&gt;

&lt;h2 id=&#34;what-we-wrote&#34;&gt;What We Wrote&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://dartcannon.com/blog/2018-accepting-an-estimate&#34; target=&#34;_blank&#34;&gt;Accepting an Estimate&lt;/a&gt; - DartCannon is about creating simulations of
complex business problems, but once the simulation is done the work of
getting buy-in starts. We shared some pointers on how to approach those
conversations and potential pitfalls to look out for.&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://dartcannon.com/blog/2018-the-curse-of-parallel-tasks&#34; target=&#34;_blank&#34;&gt;The Curse of Parallel Tasks&lt;/a&gt; - It is so easy for our intuition to lead
us astray as problems get complex. Here we show how you can be fooled even
in the simple case of tasks with equal estimates performed in parallel.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&#34;what-we-ve-been-reading&#34;&gt;What We&amp;rsquo;ve Been Reading&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;http://www.johngoodpasture.com/2018/07/if-you-only-know-one-thing-about-risk.html&#34; target=&#34;_blank&#34;&gt;If you only know one thing about Risk Management&amp;hellip;&lt;/a&gt;
(J. Goodpasture, &lt;em&gt;Musings on project management&lt;/em&gt;) John is always good at
bringing complex ideas back to fundamentals. Evidently, coupling of
projects is on everyone&amp;rsquo;s mind: while we wrote about &lt;a href=&#34;https://dartcannon.com/blog/2018-the-curse-of-parallel-tasks&#34; target=&#34;_blank&#34;&gt;parallel tasks&lt;/a&gt;, John
wrote about the importance of schedule slack where coupling is unavoidable.&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://www.pmi.org/learning/library/monte-carlo-analysis-product-development-7862&#34; target=&#34;_blank&#34;&gt;Real-world Monte Carlo analysis&lt;/a&gt;
(Cook, M. S., &lt;em&gt;Paper presented at Project Management Institute Annual
Seminars &amp;amp; Symposium, Nashville, TN&lt;/em&gt;) It likely comes as no surprise that
we&amp;rsquo;re slightly geeky around here, so no one should be shocked that we read
this conference paper from 2001. It is great if you&amp;rsquo;re looking for a more rigorous
review of the application of Monte Carlo analysis, the simulation method
underlying DartCannon.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&#34;dartcannon-change-spotlight&#34;&gt;DartCannon Change Spotlight&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;We are constantly improving DartCannon, and while the majority of
changes are too small to call out, we want to highlight larger improvements.&lt;/em&gt;&lt;/p&gt;

&lt;h3 id=&#34;excel-import-enhancements&#34;&gt;Excel Import Enhancements&lt;/h3&gt;

&lt;p&gt;In addition to overall performance improvements to XLSX import &lt;em&gt;and&lt;/em&gt; export,
we&amp;rsquo;ve added some smarts to the import process. Now, DartCannon will
automatically attempt to identify which sheet contains your items and risks AND
try to identify what each column represents.&lt;/p&gt;

&lt;p&gt;Excel import/export is a Pro feature, so you need to &lt;a href=&#34;https://dartcannon.com/subscribe&#34; target=&#34;_blank&#34;&gt;subscribe&lt;/a&gt; to take
advantage.&lt;/p&gt;

&lt;h2 id=&#34;an-end-of-week-quote&#34;&gt;An end-of-week quote:&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;A goal without a plan is just a wish - Antoine de Saint-Exupéry&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Hope everyone has a great weekend!&lt;/p&gt;

&lt;p&gt;See something you think we&amp;rsquo;d like to share? Send an email to
weekly@dartcannon.com or connect on Twitter, &lt;a href=&#34;https://twitter.com/dartcannon&#34; target=&#34;_blank&#34;&gt;@dartcannon&lt;/a&gt;.&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>edgarWebR 1.0</title>
      <link>/2018/07/edgarwebr-1.0/</link>
      <pubDate>Tue, 03 Jul 2018 00:00:00 +0000</pubDate>
      
      <guid>/2018/07/edgarwebr-1.0/</guid>
      <description>

&lt;p&gt;I recently realized that &lt;a href=&#34;https://mwaldstein.github.io/edgarWebR/&#34; target=&#34;_blank&#34;&gt;edgarWebR&lt;/a&gt; 1.0 was released a while ago without much
fanfare. 1.0 is a major milestone for the library, bringing the full set of
(initial) planned functionality along with some bonus features.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Headline features:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;100%&lt;/strong&gt; coverage of &lt;a href=&#34;https://www.sec.gov/edgar/searchedgar/webusers.htm&#34; target=&#34;_blank&#34;&gt;SEC search tools&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Parsing of submissions into component files and 10-X filings into items and
 parts.&lt;/li&gt;
&lt;li&gt;A dataset of SIC mappings.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;What&amp;rsquo;s Next:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Bugfixes - corner cases keep popping up that need fixing&lt;/li&gt;
&lt;li&gt;Parsing Improvements - I have some ideas about table handling that will help
 anyone interested in getting data out of older filings&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&#34;edgar-tools&#34;&gt;EDGAR Tools&lt;/h2&gt;

&lt;p&gt;The EDGAR System provides a number of &lt;a href=&#34;https://www.sec.gov/edgar/searchedgar/webusers.htm&#34; target=&#34;_blank&#34;&gt;tools&lt;/a&gt; for filing and entity lookup and examination. As of v1.0, edgarWebR supports all public search and browse interfaces.&lt;/p&gt;

&lt;h3 id=&#34;search-interfaces&#34;&gt;Search Interfaces&lt;/h3&gt;

&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Tool&lt;/th&gt;
&lt;th&gt;URL&lt;/th&gt;
&lt;th&gt;edgarWebR function(s)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;

&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Company&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/edgar/searchedgar/companysearch.html&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/edgar/searchedgar/companysearch.html&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;company_search()&lt;/code&gt;, &lt;code&gt;company_information()&lt;/code&gt;, &lt;code&gt;company_details()&lt;/code&gt;, &lt;code&gt;company_filings()&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;Recent Filings&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/cgi-bin/browse-edgar?action=getcurrent&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/cgi-bin/browse-edgar?action=getcurrent&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;latest_filings()&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;Full Text&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;http://searchwww.sec.gov/EDGARFSClient/jsp/EDGAR_MainAccess.jsp&#34; target=&#34;_blank&#34;&gt;http://searchwww.sec.gov/EDGARFSClient/jsp/EDGAR_MainAccess.jsp&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;full_text()&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;Header Search&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/cgi-bin/srch-edgar&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/cgi-bin/srch-edgar&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;header_search()&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;Fund Disclosures&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/edgar/searchedgar/prospectus.htm&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/edgar/searchedgar/prospectus.htm&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Use &lt;code&gt;company_search()&lt;/code&gt; and specify the &amp;lsquo;type&amp;rsquo; parameter as 485&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;Fund Voting Records&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/edgar/searchedgar/n-px.htm&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/edgar/searchedgar/n-px.htm&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Use &lt;code&gt;company_search()&lt;/code&gt; and specify the &amp;lsquo;type&amp;rsquo; parameter as &amp;lsquo;N-PX&amp;rsquo;&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;Fund Search&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/edgar/searchedgar/mutualsearch.html&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/edgar/searchedgar/mutualsearch.html&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;fund_search()&lt;/code&gt;, &lt;code&gt;fund_fast_search()&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;Var. Insurance Products&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/edgar/searchedgar/vinsurancesearch.html&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/edgar/searchedgar/vinsurancesearch.html&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;variable_insurance_search()&lt;/code&gt;, &lt;code&gt;variable_insurance_fast_search()&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;Confidential treatment orders&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/edgar/searchedgar/ctorders.htm&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/edgar/searchedgar/ctorders.htm&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Use &lt;code&gt;header_search()&lt;/code&gt;, &lt;code&gt;company_search()&lt;/code&gt;, &lt;code&gt;latest_filings()&lt;/code&gt;, or &lt;code&gt;full_text()&lt;/code&gt; and use form types &amp;lsquo;CT ORDER&amp;rsquo;&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;Effectiveness notices&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/cgi-bin/browse-edgar?action=geteffect&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/cgi-bin/browse-edgar?action=geteffect&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;effectiveness()&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;CIK&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/edgar/searchedgar/cik.htm&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/edgar/searchedgar/cik.htm&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;cik_search()&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;Daily Filings&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/edgar/searchedgar/currentevents.htm&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/edgar/searchedgar/currentevents.htm&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;&lt;code&gt;current_events()&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;

&lt;tr&gt;
&lt;td&gt;Correspondence&lt;/td&gt;
&lt;td&gt;&lt;a href=&#34;https://www.sec.gov/answers/edgarletters.htm&#34; target=&#34;_blank&#34;&gt;https://www.sec.gov/answers/edgarletters.htm&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Use &lt;code&gt;header_search()&lt;/code&gt;, &lt;code&gt;company_search()&lt;/code&gt;, &lt;code&gt;latest_filings()&lt;/code&gt;, or &lt;code&gt;full_text()&lt;/code&gt; and use form types &amp;lsquo;upload&amp;rsquo; or &amp;lsquo;corresp&amp;rsquo;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;

&lt;p&gt;Once a filing is found via any of the above, there are a number of functions to process the result:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;filing_documents()&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;filing_filers()&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;filing_funds()&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;filing_information()&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;filing_details()&lt;/code&gt; - returns all 4 of the filing components in a list.&lt;/li&gt;
&lt;/ul&gt;
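
&lt;p&gt;As a sketch of how these pieces chain together (the ticker and the choice of the first result are purely illustrative):&lt;/p&gt;

&lt;pre&gt;&lt;code class=&#34;language-r&#34;&gt;library(edgarWebR)

# Find a company&amp;#39;s recent annual reports...
filings &amp;lt;- company_filings(&amp;quot;AAPL&amp;quot;, type = &amp;quot;10-K&amp;quot;, count = 5)

# ...then list the component documents of the most recent one
docs &amp;lt;- filing_documents(filings$href[1])
&lt;/code&gt;&lt;/pre&gt;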

&lt;h3 id=&#34;parsing-tools&#34;&gt;Parsing Tools&lt;/h3&gt;

&lt;p&gt;While edgarWebR is primarily focused on providing an interface to the online SEC tools, there are a few activities for handling filing documents for which no current tools exist.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;parse_submission()&lt;/code&gt; - takes a full submission SGML document and parses out
component documents. Most of the time, the documents of interest in a
particular submission will be online and accessible via &lt;code&gt;filing_documents()&lt;/code&gt; -
this function is to unpack the raw submission to get all the documents. You
may also find it more efficient if you&amp;rsquo;re regularly downloading all of the
files in a given submission.&lt;/li&gt;
&lt;li&gt;&lt;code&gt;parse_filing()&lt;/code&gt; - Takes a HTML narrative filing and annotates each paragraph
with item and part numbers.&lt;/li&gt;
&lt;/ul&gt;
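
&lt;p&gt;A minimal sketch of &lt;code&gt;parse_filing()&lt;/code&gt; in use; the &lt;code&gt;doc.href&lt;/code&gt; here is a placeholder for a document URL you would normally get from &lt;code&gt;filing_documents()&lt;/code&gt;:&lt;/p&gt;

&lt;pre&gt;&lt;code class=&#34;language-r&#34;&gt;library(edgarWebR)

# doc.href: the URL of an HTML narrative filing (e.g. a 10-K document)
doc &amp;lt;- parse_filing(doc.href)

# Each row is a paragraph of text annotated with its part and item
head(doc)
&lt;/code&gt;&lt;/pre&gt;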

&lt;h3 id=&#34;data-sets&#34;&gt;Data Sets&lt;/h3&gt;

&lt;p&gt;There is one dataset provided with edgarWebR, &lt;code&gt;sic_codes&lt;/code&gt;, providing a catalog of SIC codes and their hierarchy.&lt;/p&gt;
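
&lt;p&gt;For a quick look at the dataset once the package is loaded:&lt;/p&gt;

&lt;pre&gt;&lt;code class=&#34;language-r&#34;&gt;library(edgarWebR)

# Browse the SIC code catalog
head(sic_codes)
&lt;/code&gt;&lt;/pre&gt;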

&lt;h3 id=&#34;url-tools&#34;&gt;URL Tools&lt;/h3&gt;

&lt;p&gt;There are also a number of utility functions to help construct useful URLs once you have a company CIK, submission accession number, or specific file.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;company_href()&lt;/code&gt; for linking to the company page&lt;/li&gt;
&lt;li&gt;&lt;code&gt;submission_index_href()&lt;/code&gt; and its family of related functions for linking to a specific submission and file.&lt;/li&gt;
&lt;/ul&gt;
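
&lt;p&gt;For example, the URL helpers take an identifier and return a ready-to-use link (the CIK below is illustrative):&lt;/p&gt;

&lt;pre&gt;&lt;code class=&#34;language-r&#34;&gt;library(edgarWebR)

# Link to a company&amp;#39;s EDGAR browse page by CIK
company_href(&amp;quot;0000320193&amp;quot;)
&lt;/code&gt;&lt;/pre&gt;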

&lt;h2 id=&#34;installation&#34;&gt;Installation&lt;/h2&gt;

&lt;p&gt;edgarWebR is available from CRAN, so it can be installed simply via&lt;/p&gt;

&lt;pre&gt;&lt;code class=&#34;language-r&#34;&gt;install.packages(&amp;quot;edgarWebR&amp;quot;)
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;To install the development version,&lt;/p&gt;

&lt;pre&gt;&lt;code class=&#34;language-r&#34;&gt;# Install the development version from GitHub:
# install.packages(&amp;quot;devtools&amp;quot;)
devtools::install_github(&amp;quot;mwaldstein/edgarWebR&amp;quot;)
&lt;/code&gt;&lt;/pre&gt;

&lt;h2 id=&#34;contributing&#34;&gt;Contributing&lt;/h2&gt;

&lt;p&gt;If you&amp;rsquo;ve found this package helpful, contributions are always appreciated at
the &lt;a href=&#34;https://github.com/mwaldstein/edgarWebR/&#34; target=&#34;_blank&#34;&gt;page on github&lt;/a&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Bug Reports&lt;/li&gt;
&lt;li&gt;Code improvements&lt;/li&gt;
&lt;li&gt;Documentation&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I&amp;rsquo;m also always excited to hear about how the package is being used. If this
package has helped you in any way, drop me a note!&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>Heatmaps of Spherical Densities in R</title>
      <link>/external/rsphericaldensity/posts/heatmap/</link>
      <pubDate>Fri, 15 Jun 2018 21:13:14 -0500</pubDate>
      
      <guid>/external/rsphericaldensity/posts/heatmap/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://micah.waldste.in/blog/2018/06/introduction-to-spherical-densities-in-r/&#34;&gt;Last time&lt;/a&gt; we made contour maps of densities of points on a globe, now it is time to take another step and make heatmaps. We created all the data we needed when creating the contours, but heatmaps add new challenges of dealing with large amounts of raster and polygon data. Lets get to it.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;DISCLAIMER: While I know a thing or two, there’s a reasonable chance I got some things wrong or at very least there are certainly more efficient ways to go about things. Feedback always appreciated!&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;NOTE: This has been updated to use the native &lt;code&gt;vmf.kerncontour&lt;/code&gt; since the release of Directional 3.3 supports returning results&lt;/em&gt;&lt;/p&gt;
&lt;div id=&#34;set-up&#34; class=&#34;section level1&#34;&gt;
&lt;h1&gt;Set-Up&lt;/h1&gt;
&lt;p&gt;First, we’ll make use of a number of libraries and setup our plotting environment:&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;library(rgdal)       # For coordinate transforms
library(sp)          # For plotting grid images
library(sf)
library(lwgeom)
library(Directional) # For spherical density functions
library(spData)      # worldmap
library(raster)
library(magick)      # for animating
par(bg = NA)&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;We’ll also use the same &lt;code&gt;vmf_density_grid&lt;/code&gt; function we introduced in the Intro post.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;vmf_density_grid &amp;lt;- function(u, ngrid = 100) {
  # Translate to (0,180) and (0,360)
  u[,1] &amp;lt;- u[,1] + 90
  u[,2] &amp;lt;- u[,2] + 180
  res &amp;lt;- vmf.kerncontour(u, thumb = &amp;quot;none&amp;quot;, den.ret = T, full = T,
                             ngrid = ngrid)

  # Translate back to (-90, 90) and (-180, 180) and create a grid of
  # coordinates
  ret &amp;lt;- expand.grid(Lat = res$lat - 90, Long = res$long - 180)
  ret$Density &amp;lt;- c(res$den)
  ret
}&lt;/code&gt;&lt;/pre&gt;
&lt;div id=&#34;global-earthquakes-again&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;Global Earthquakes Again&lt;/h2&gt;
&lt;p&gt;The global earthquake catalog from the &lt;a href=&#34;http://www.ncedc.org/anss/catalog-search.html&#34;&gt;Northern California Earthquake Data Center&lt;/a&gt; is a great dataset we’ll continue to use, so we start with the set of quakes since Jan 1, 1950 of magnitude 5.9 or higher.&lt;/p&gt;
&lt;p&gt;For all our heatmaps, we’ll start the same as we did for contours, calculating the density map:&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;grid.size = 100
earthquakes &amp;lt;- read.csv(file.path(&amp;quot;..&amp;quot;, &amp;quot;data&amp;quot;, &amp;quot;earthquakes.csv&amp;quot;))
earthquake.densities &amp;lt;- vmf_density_grid(earthquakes[,c(&amp;quot;Latitude&amp;quot;,
                                                        &amp;quot;Longitude&amp;quot;)],
                                         ngrid = grid.size)&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Once we have the densities, we need to coerce them into a spatial format - in this case we’ll create a &lt;code&gt;SpatialGridDataFrame&lt;/code&gt;, matching the grid of densities we calculated with &lt;code&gt;vmf_density_grid&lt;/code&gt;.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;density_matrix &amp;lt;- matrix(earthquake.densities$Density, nrow = grid.size)
density_matrix &amp;lt;- t(apply(density_matrix, 2, rev))
gridVals &amp;lt;- data.frame(att=as.vector(density_matrix))
gt &amp;lt;- GridTopology(cellcentre.offset = c(-180 + 180 / grid.size,
                                         -90 + 90 / grid.size),
                   cellsize = c( 360 / grid.size, 180 / grid.size),
                   cells.dim = c(grid.size, grid.size))
sGDF &amp;lt;- SpatialGridDataFrame(gt,
                             data = gridVals,
                             proj = &amp;quot;+proj=longlat +datum=WGS84 +no_defs&amp;quot;)

plot(sGDF)
plot(gridlines(sGDF), add = TRUE, col = &amp;quot;grey30&amp;quot;, alpha = .1)
plot(st_geometry(world), add = TRUE, col = NA, border = &amp;quot;grey&amp;quot;)&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;img src=&#34;/external/rSphericalDensity/posts/heatmap_files/figure-html/earthquake_plot-1.png&#34; width=&#34;672&#34; /&gt;&lt;/p&gt;
&lt;p&gt;Great, we have a heatmap! But it is in rectangular coordinates; we want to project it to something nicer, like a Winkel tripel. There’s a problem though… We can’t just re-project our SpatialGridDataFrame: it gets interpolated into points, losing our nice smooth heatmap.&lt;/p&gt;
&lt;p&gt;There are two real options for us:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Convert to raster data, then project the raster&lt;/li&gt;
&lt;li&gt;Convert to raster, convert to polygons, project the polygons&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;div id=&#34;projecting-raster-data&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;Projecting Raster Data&lt;/h2&gt;
&lt;p&gt;This is really slow, so we have to turn the resolution way down.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;r &amp;lt;- raster(sGDF)
crs1 &amp;lt;- &amp;quot;+proj=wintri&amp;quot;
world.crs1 &amp;lt;- st_transform_proj(world, crs = crs1)

pr1 &amp;lt;- projectExtent(r, crs1)
res(pr1) &amp;lt;- 9e5
pr2 &amp;lt;- projectRaster(r, pr1, method = &amp;quot;bilinear&amp;quot;, over = TRUE)
plot(pr2)
plot(st_geometry(world.crs1), add = TRUE, col = NA, border = &amp;quot;grey&amp;quot;)&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;img src=&#34;/external/rSphericalDensity/posts/heatmap_files/figure-html/earthquake_proj_raster-1.png&#34; width=&#34;672&#34; /&gt;&lt;/p&gt;
&lt;p&gt;I guess this works, but the low resolution suggests we can do better.&lt;/p&gt;
&lt;/div&gt;
&lt;div id=&#34;using-polygons&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;Using Polygons&lt;/h2&gt;
&lt;p&gt;We’ll use raster data again, but we’ll immediately convert it into a grid of square polygons which we can then project.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;r2 &amp;lt;- raster(sGDF)
# We&amp;#39;ll manually colorize
r2 &amp;lt;- cut(r2,
          pretty(r2[], 50),
          include.lowest = F)
color.vals &amp;lt;- rev(terrain.colors(50))
pol &amp;lt;- rasterToPolygons(r2)
crs1 &amp;lt;- &amp;quot;+proj=wintri&amp;quot;
world.crs1 &amp;lt;- st_transform_proj(world, crs = crs1)
pol.crs1 &amp;lt;- spTransform(pol, crs1)
par(mar = c(0, 0, 0, 0))
plot(pol.crs1, col=color.vals[r2[]], border = NA)
# plot(gridlines(sgdf.crs1), add = TRUE, col = &amp;quot;grey30&amp;quot;, alpha = .1)
plot(st_geometry(world.crs1), add = TRUE, col = NA, border = &amp;quot;grey&amp;quot;)&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;img src=&#34;/external/rSphericalDensity/posts/heatmap_files/figure-html/earthquakes_projected-1.png&#34; width=&#34;672&#34; /&gt;&lt;/p&gt;
&lt;p&gt;Now that looks good!&lt;/p&gt;
&lt;p&gt;One thing to keep in mind, however: because our polygons are rectangular in unprojected latitude/longitude coordinates, they will warp and distort as a projection gets more severe. In our animation, you can see what I mean.&lt;/p&gt;
&lt;/div&gt;
&lt;div id=&#34;animating&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;Animating&lt;/h2&gt;
&lt;p&gt;We’re projecting into an orthographic projection to simulate the rotating globe. A few things you’ll see in the code where I jump through hoops:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Cropping the top&lt;/strong&gt; - If I leave the top polygons in place, they bunch up in an ugly fashion&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Making features valid&lt;/strong&gt; - For both the world and our heatmap polygons, I take extra steps to make sure only valid polygons get through to the final plot.&lt;/li&gt;
&lt;/ul&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;r3 &amp;lt;- raster(sGDF)

# Crop down because projecting the poles causes problems
r.crop &amp;lt;- res(r3)
rc &amp;lt;- crop(r3, extent(-180, 180,
                      -90 + r.crop[2], 90 - r.crop[2]))
pol &amp;lt;- rasterToPolygons(rc)
pol.breaks &amp;lt;- pretty(pol$att, 20)
pol.colors &amp;lt;- rev(terrain.colors(length(pol.breaks) - 1))
# Make the lowest color transparent
substr(pol.colors[1], 8, 9) &amp;lt;- &amp;quot;00&amp;quot;

n.frames &amp;lt;- 30
img &amp;lt;- image_graph(400, 400, res = 96)
par(mar = c(0, 0, 0, 0))
grad &amp;lt;- st_graticule(ndiscr = 1e4)
for (i in 1:n.frames) {
  long &amp;lt;- -180 + (i - 1) * 360 / n.frames
  crs.ani &amp;lt;- paste0(&amp;quot;+proj=ortho +lat_0=0 +lon_0=&amp;quot;, long)
  grad.ani &amp;lt;- st_geometry(st_transform(grad, crs.ani))

  world.ani &amp;lt;- st_transform(st_geometry(world), crs = crs.ani)

  # For some reason this stopped working,
  # Not including results in more countries not rendering properly
  # world.ani &amp;lt;- lwgeom::st_make_valid(world.ani)

  # We don&amp;#39;t want the points
  world.ani &amp;lt;- world.ani[st_geometry_type(world.ani) %in% c(&amp;#39;POLYGON&amp;#39;,
                                                            &amp;#39;MULTIPOLYGON&amp;#39;)]

  # There are inevitably some bad polygons out of the transform
  world.ani &amp;lt;- world.ani[st_is_valid(world.ani)]

  pol.ani &amp;lt;- st_transform(as(pol, &amp;quot;sf&amp;quot;), crs.ani)
  pol.ani.geo &amp;lt;- lwgeom::st_make_valid(pol.ani)
  pol.ani.geo &amp;lt;- pol.ani.geo[st_geometry_type(pol.ani.geo) %in% c(&amp;#39;POLYGON&amp;#39;,
                                                                  &amp;#39;MULTIPOLYGON&amp;#39;,
                                                                  &amp;#39;GEOMETRYCOLLECTION&amp;#39;), ]
  pol.ani.geo &amp;lt;- pol.ani.geo[st_is_valid(pol.ani.geo), ]
  pol.ani.geo &amp;lt;- pol.ani.geo[!st_is_empty(pol.ani.geo), ]

  plot(grad.ani, col = &amp;quot;black&amp;quot;)
  plot(world.ani, add = TRUE, col = &amp;quot;grey30&amp;quot;, border = &amp;quot;grey&amp;quot;)
  plot(pol.ani.geo, border = NA, breaks = pol.breaks, pal = pol.colors,
       add = TRUE, main = NA, key.pos = NULL)
}
msg &amp;lt;- dev.off()
image_animate(img, fps = 10)&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;img src=&#34;/external/rSphericalDensity/posts/heatmap_files/figure-html/earthquake_ani-1.gif&#34; /&gt;&lt;!-- --&gt;&lt;/p&gt;
&lt;p&gt;Looks pretty good, but we do have some interesting world map problems with countries popping out as they reach the edge… Something to investigate another day.&lt;/p&gt;
&lt;/div&gt;
&lt;div id=&#34;final-notes&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;Final Notes&lt;/h2&gt;
&lt;p&gt;In both these examples we’ve used global data as it shows the problems of using “traditional” density estimators, but the same issue exists at all scales. It is just a question of when a simpler approximation is reasonable.&lt;/p&gt;
&lt;p&gt;You can also see a bit of blockiness which we could reduce with an increase in grid size, but that will be very dependent on need.&lt;/p&gt;
&lt;p&gt;Next, some real data…&lt;/p&gt;
&lt;p&gt;If you want to explore the code yourself, everything is on &lt;a href=&#34;https://github.com/mwaldstein/rSphericalDensity&#34;&gt;github here&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div id=&#34;appendix&#34; class=&#34;section level1&#34;&gt;
&lt;h1&gt;Appendix&lt;/h1&gt;
&lt;div id=&#34;spherical-density-function&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;Spherical Density Function&lt;/h2&gt;
&lt;p&gt;This calculates a grid of densities which can then be used with &lt;code&gt;geom_contour&lt;/code&gt;. The code basically comes directly from &lt;a href=&#34;https://rdrr.io/cran/Directional/man/vmf.kerncontour.html&#34;&gt;Directional’s vmf.kerncontour&lt;/a&gt;, only returning a data.frame instead of actually plotting the output.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;vmf.kerncontour.new &amp;lt;- function(u, thumb = &amp;quot;none&amp;quot;, ret.all = FALSE, full = FALSE,
                            ngrid = 100) {
  ## u contains the data in latitude and longitude
  ## the first column is the latitude and the
  ## second column is the longitude
  ## thumb is either &amp;#39;none&amp;#39; (default), or &amp;#39;rot&amp;#39; (Garcia-Portugues, 2013)
  ## ret.all if set to TRUE returns a matrix with latitude, longitude and density
  ## full if set to TRUE calculates densities for the full sphere, otherwise
  ##   using extents of the data
  ## ngrid specifies the number of points taken at each axis
  n &amp;lt;- dim(u)[1]  ## sample size
  x &amp;lt;- euclid(u)

  if (thumb == &amp;quot;none&amp;quot;) {
    h &amp;lt;- as.numeric( vmfkde.tune(x, low = 0.1, up = 1)[1] )
  } else if (thumb == &amp;quot;rot&amp;quot;) {
    k &amp;lt;- vmf(x)$kappa
    h &amp;lt;- ( (8 * sinh(k)^2) / (k * n * ( (1 + 4 * k^2) * sinh(2 * k) -
    2 * k * cosh(2 * k)) ) ) ^ ( 1/6 )
  }

  if (full) {
    x1 &amp;lt;- seq( 0, 180, length = ngrid )  ## latitude
    x2 &amp;lt;- seq( 0, 360, length = ngrid )  ## longitude
  } else {
    x1 &amp;lt;- seq( min(u[, 1]) - 5, max(u[, 1]) + 5, length = ngrid )  ## latitude
    x2 &amp;lt;- seq( min(u[, 2]) - 5, max(u[, 2]) + 5, length = ngrid )  ## longitude
  }
  cpk &amp;lt;- 1 / (  ( h^2)^0.5 *(2 * pi)^1.5 * besselI(1/h^2, 0.5) )
  mat &amp;lt;- matrix(nrow = ngrid, ncol = ngrid)

  for (i in 1:ngrid) {
    for (j in 1:ngrid) {
      y &amp;lt;- euclid( c(x1[i], x2[j]) )
      a &amp;lt;- as.vector( tcrossprod(x, y / h^2) )
      can &amp;lt;- sum( exp(a + log(cpk)) ) / ngrid
      if (abs(can) &amp;lt; Inf)   mat[i, j] &amp;lt;- can
    }
  }

  if (ret.all) {
    return(list(Lat = x1, Long = x2, h = h, d = mat))
  } else {
    contour(x1, x2, mat, nlevels = 10, col = 2, xlab = &amp;quot;Latitude&amp;quot;,
            ylab = &amp;quot;Longitude&amp;quot;)
    points(u[, 1], u[, 2])
  }
}&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;
&lt;div id=&#34;references&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;References&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Earthquake data was accessed through the &lt;a href=&#34;http://www.ncedc.org/anss/catalog-search.html&#34;&gt;Northern California Earthquake Data Center (NCEDC)&lt;/a&gt;, &lt;a href=&#34;doi:10.7932/NCEDC&#34; class=&#34;uri&#34;&gt;doi:10.7932/NCEDC&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>Introduction to Spherical Densities in R</title>
      <link>/blog/2018/06/introduction-to-spherical-densities-in-r/</link>
      <pubDate>Sun, 10 Jun 2018 21:13:14 -0500</pubDate>
      
      <guid>/blog/2018/06/introduction-to-spherical-densities-in-r/</guid>
      <description>&lt;p&gt;It always happens… I get interested in what I think will be a small data project to scratch some itch and end up down a deep rabbit hole. In this case, a passing interest in the geographic distribution of some samples (more on that in a future post) led to a deep dive into spherical distributions and densities.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;DISCLAIMER: While I know a thing or two, there’s a reasonable chance I got some things wrong or at very least there are certainly more efficient ways to go about things. Feedback always appreciated!&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;em&gt;NOTE: This has been updated to use the native &lt;code&gt;vmf.kerncontour&lt;/code&gt;, since Directional 3.3 now supports returning the density results directly&lt;/em&gt;&lt;/p&gt;
&lt;div id=&#34;motivation&#34; class=&#34;section level1&#34;&gt;
&lt;h1&gt;Motivation&lt;/h1&gt;
&lt;p&gt;While my interest was in measuring the density of points on a map, there are plenty of other cases where you might care about the distribution of points on a sphere. The trouble is that most commonly available functions, e.g. &lt;code&gt;geom_density_2d&lt;/code&gt; from ggplot2, only handle regular grid coordinates.&lt;/p&gt;
&lt;p&gt;The problems take two forms:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Global densities simply fail at the ‘edge’ of coordinates - e.g. near the poles or near +/- 180 degrees longitude.&lt;/li&gt;
&lt;li&gt;Projection issues. On small scales and near the equator, it is generally safe to make the simplification that longitude/latitude forms a square grid. At larger scales and closer to the poles, that assumption breaks down.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;I think it is important to point out that there are many tutorials on plotting event densities on maps (e.g. crime occurrences), but these are all at the city scale, where using existing methods is a reasonable approximation.&lt;/p&gt;
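&lt;p&gt;As a quick illustration of the second point (a back-of-the-envelope sketch with arbitrary example latitudes), the east-west distance covered by one degree of longitude shrinks with the cosine of latitude, so a “square” latitude/longitude cell near the pole is far narrower on the ground than one at the equator:&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;# East-west ground distance spanned by 1 degree of longitude
r.earth &amp;lt;- 6371  # mean Earth radius, km
lats &amp;lt;- c(0, 30, 60, 85)
round((pi / 180) * r.earth * cos(lats * pi / 180))
# Roughly 111 km at the equator, shrinking to about 10 km at 85 degrees&lt;/code&gt;&lt;/pre&gt;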
&lt;/div&gt;
&lt;div id=&#34;set-up&#34; class=&#34;section level1&#34;&gt;
&lt;h1&gt;Set-Up&lt;/h1&gt;
&lt;p&gt;First, we’ll make use of a number of libraries and setup our plotting environment:&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;library(ggplot2)     # For most of our plotting
library(cowplot)     # grid arrangement of plots
library(Directional) # For spherical density functions
library(maps)        # vector maps of the world
library(hrbrthemes)  # hrbrmstr themes
library(magick)      # For animation
library(mapproj)     # Needed for projection

# And set some theme defaults
theme_set(theme_ipsum())
# Axis settings we&amp;#39;ll reuse a lot
no.axis &amp;lt;- theme(axis.ticks.y = element_blank(), axis.text.y = element_blank(),
                 axis.ticks.x = element_blank(), axis.text.x = element_blank(),
                 axis.title.x = element_blank(), axis.title.y = element_blank())&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Next, for this example, we’ll be using a random blob placed on a sphere. I’ll use the &lt;code&gt;rvmf&lt;/code&gt; function from the Directional package. Directional is a general purpose library using Latitude defined from 0 to 180 degrees and Longitude from 0 to 360 instead of -90 to 90 and -180 to 180 respectively. The &lt;code&gt;random_points&lt;/code&gt; function here gives us points in a coordinate system we’re used to.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;random_points &amp;lt;- function(n_points, lat, lon, concentration) {
  # Directional defines lat + long as 0-180 and 0-360 respectively so we
  # have to shift back and forth
  mu &amp;lt;- euclid(c(lat + 90, lon + 180))[1,]
  pts &amp;lt;- euclid.inv(rvmf(n_points, mu, concentration))
  pts[,1] &amp;lt;- pts[,1] - 90
  pts[,2] &amp;lt;- pts[,2] - 180
  data.frame(pts)
}&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;
&lt;div id=&#34;problem&#34; class=&#34;section level1&#34;&gt;
&lt;h1&gt;Problem&lt;/h1&gt;
&lt;p&gt;To visualize the problem, we’ll create 2 sets of points, one centered on the map, the other near the pole and near 180 degrees. We’ll then plot the contours of the densities to show the issue.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;offset.pos &amp;lt;- list(Lat = 75, Long = 175)
positions.center &amp;lt;- random_points(1000, 0, 0, 10)
positions.offset &amp;lt;- random_points(1000, offset.pos$Lat, offset.pos$Long, 10)
plot.colors &amp;lt;- hcl(h = c(0:3)*90, c = 50 , l = 70)
g.base &amp;lt;- ggplot(positions.center, aes(x = Long, y = Lat)) +
          scale_y_continuous(breaks = (-2:2) * 30, limits = c(-90, 90)) +
          scale_x_continuous(breaks = (-4:4) * 45, limits = c(-180, 180)) +
          coord_map()

g.broken &amp;lt;- g.base +
     # The centered random points
     geom_density_2d(color = plot.colors[1]) +
     geom_point(size = 0.5, stroke = 0, color = plot.colors[1]) +
     # The offset random points
     geom_density_2d(data = positions.offset, color = plot.colors[2]) +
     geom_point(data = positions.offset, size = 0.5, stroke = 0,
                color = plot.colors[2])

ortho.projections &amp;lt;- plot_grid(
  g.broken + coord_map(&amp;quot;ortho&amp;quot;, orientation = c(0, 0, 0)) + no.axis,
  g.broken + coord_map(&amp;quot;ortho&amp;quot;, orientation = c(offset.pos$Lat, offset.pos$Long, 0))
           + no.axis,
  labels = NULL,
  align = &amp;#39;h&amp;#39;)
g.broken
ortho.projections&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;img src=&#34;/external/rSphericalDensity/posts/intro_files/figure-html/problem-1.png&#34; width=&#34;672&#34; /&gt;&lt;img src=&#34;/external/rSphericalDensity/posts/intro_files/figure-html/problem-2.png&#34; width=&#34;672&#34; /&gt;&lt;/p&gt;
&lt;p&gt;We can quickly see the problem looking at the blue offset density plot - there are multiple “centers” and the contours don’t connect cleanly.&lt;/p&gt;
&lt;/div&gt;
&lt;div id=&#34;spherical-densities&#34; class=&#34;section level1&#34;&gt;
&lt;h1&gt;Spherical Densities&lt;/h1&gt;
&lt;p&gt;The solution is to use spherical densities, and fortunately the Directional package provides functions for spherical (and in fact circular, and arbitrary-dimension sphere) distributions using the &lt;a href=&#34;https://en.wikipedia.org/wiki/Von_Mises%E2%80%93Fisher_distribution&#34;&gt;von Mises-Fisher distribution&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Our basic approach will be the following steps:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Calculate a “grid” of densities manually, covering the entire globe&lt;/li&gt;
&lt;li&gt;Use &lt;code&gt;geom_contour&lt;/code&gt; to turn those density maps into contour curves&lt;/li&gt;
&lt;li&gt;Plot away!&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Before we fix the problem using spherical densities, we first need to do some setup. We’ll be using &lt;code&gt;vmf.kerncontour&lt;/code&gt; from the Directional library. In CRAN versions before 3.3, that function only plotted the contours itself; as of 3.3 it can return the underlying data, which is what we use below. For reference, a stand-alone version that returns the data is included in the Appendix as &lt;code&gt;vmf.kerncontour.new&lt;/code&gt;.&lt;/p&gt;
&lt;p&gt;Similar to what we did for &lt;code&gt;random_points&lt;/code&gt;, we also need to translate &lt;code&gt;vmf.kerncontour&lt;/code&gt;’s input and output to our more familiar formats.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;vmf_density_grid &amp;lt;- function(u, ngrid = 100) {
  # Translate to (0,180) and (0,360)
  u[,1] &amp;lt;- u[,1] + 90
  u[,2] &amp;lt;- u[,2] + 180
  res &amp;lt;- vmf.kerncontour(u, thumb = &amp;quot;none&amp;quot;, den.ret = T, full = T,
                             ngrid = ngrid)

  # Translate back to (-90, 90) and (-180, 180) and create a grid of
  # coordinates
  ret &amp;lt;- expand.grid(Lat = res$lat - 90, Long = res$long - 180)
  ret$Density &amp;lt;- c(res$den)
  ret
}&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Now we can go ahead and calculate the densities and plot the contours. We’ll keep the “bad” contours for comparison.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;densities.center &amp;lt;- vmf_density_grid(positions.center)
densities.offset &amp;lt;- vmf_density_grid(positions.offset)

g.broken &amp;lt;- g.base +
     geom_density_2d(color = plot.colors[1], alpha = .5) +
     geom_point(size = 0.5, stroke = 0, color = plot.colors[1], alpha = .5) +
     geom_density_2d(data = positions.offset, color = plot.colors[2], alpha = .5) +
     geom_point(data = positions.offset, size = 0.5, stroke = 0, color =
                plot.colors[2], alpha = .5)

g.densities &amp;lt;- g.broken +
  geom_contour(data = densities.center,
               aes(x=Long, y=Lat, z=Density),
               color = plot.colors[3]) +
  geom_contour(data = densities.offset,
               aes(x=Long, y=Lat, z=Density),
               color = plot.colors[4])

ortho.projections &amp;lt;- plot_grid(
  g.densities + coord_map(&amp;quot;ortho&amp;quot;, orientation = c(0, 0, 0)) + no.axis,
  g.densities + coord_map(&amp;quot;ortho&amp;quot;,
                          orientation = c(offset.pos$Lat, offset.pos$Long, 0))
              + no.axis,
  labels = NULL,
  align = &amp;#39;h&amp;#39;)
g.densities
ortho.projections&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;img src=&#34;/external/rSphericalDensity/posts/intro_files/figure-html/fixed_densitites-1.png&#34; width=&#34;672&#34; /&gt;&lt;img src=&#34;/external/rSphericalDensity/posts/intro_files/figure-html/fixed_densitites-2.png&#34; width=&#34;672&#34; /&gt;&lt;/p&gt;
&lt;p&gt;Particularly looking at the orthographic plots, it is easy to see that the spherical density process gives the same rings in both locations, with continuous curves.&lt;/p&gt;
&lt;div id=&#34;practical-example-global-earthquakes&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;Practical Example: Global Earthquakes&lt;/h2&gt;
&lt;p&gt;Earthquake density is used in one of the few existing attempts to perform density calculations with spherical coordinates on &lt;a href=&#34;https://www.r-bloggers.com/circular-or-spherical-data-and-density-estimation/&#34;&gt;R-Bloggers&lt;/a&gt;. The &lt;a href=&#34;http://www.ncedc.org/anss/catalog-search.html&#34;&gt;Northern California Earthquake Data Center&lt;/a&gt; provides an archive of earthquakes for download, so we start with the set of quakes since Jan 1, 1950 of magnitude 5.9 or higher. Given that data, we follow the same process as with our random data, plotting both the default 2d density contours and the contours from the spherical functions.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;earthquakes &amp;lt;- read.csv(file.path(&amp;quot;..&amp;quot;, &amp;quot;data&amp;quot;, &amp;quot;earthquakes.csv&amp;quot;))
earthquake.densities &amp;lt;- vmf_density_grid(earthquakes[,c(&amp;quot;Latitude&amp;quot;,
                                                        &amp;quot;Longitude&amp;quot;)],
                                         ngrid = 300)&lt;/code&gt;&lt;/pre&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;world &amp;lt;- map_data(&amp;quot;world&amp;quot;)
g.earthquakes &amp;lt;- ggplot() +
  geom_map(data = world, map = world,
           mapping = aes(map_id = region),
           color = &amp;quot;grey90&amp;quot;, fill = &amp;quot;grey80&amp;quot;) +
  geom_point(data = earthquakes,
             mapping = aes(x = Longitude, y = Latitude),
             color = &amp;quot;red&amp;quot;, alpha = .2, size = .5, stroke = 0) +
  geom_density_2d(data = earthquakes,
                  aes(x=Longitude, y=Latitude),
                  color = plot.colors[2], alpha = 1) +
  geom_contour(data = earthquake.densities, aes(x=Long, y=Lat, z=Density),
               color = plot.colors[4]) +
  scale_y_continuous(breaks = (-2:2) * 30, limits = c(-90, 90)) +
  scale_x_continuous(breaks = (-4:4) * 45, limits = c(-180, 180)) +
  coord_map(&amp;quot;mercator&amp;quot;)

g.earthquakes&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;img src=&#34;/external/rSphericalDensity/posts/intro_files/figure-html/earthquake_plot-1.png&#34; width=&#34;672&#34; /&gt;&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;n.frames &amp;lt;- 40
img &amp;lt;- image_graph(400, 400, res = 96)
for (i in 1:n.frames) {
  long &amp;lt;- 170 + (i - 1) * 360 / n.frames
  # We Explicitly use the &amp;#39;plot&amp;#39; command to show the ggplot
  print(g.earthquakes + coord_map(&amp;quot;ortho&amp;quot;, orientation = c(0, long, 0)) + no.axis)
}
msg &amp;lt;- dev.off()
image_animate(img, fps = 10)&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;&lt;img src=&#34;/external/rSphericalDensity/posts/intro_files/figure-html/earthquake_ani-1.gif&#34; /&gt;&lt;!-- --&gt;&lt;/p&gt;
&lt;p&gt;The yellow shows default 2d density, and you can again see the continuity problems. The blue shows the expected &lt;a href=&#34;https://en.wikipedia.org/wiki/Ring_of_fire&#34;&gt;Ring of Fire&lt;/a&gt; thanks to the spherical density. It isn’t perfect - if we were really interested in the most accurate results, we’d probably want to turn up the grid size to better follow the chains of quakes or tweak the contour breakpoints to see the fine features.&lt;/p&gt;
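&lt;p&gt;As a sketch of that tweak (the break values here are purely illustrative, not tuned), &lt;code&gt;geom_contour&lt;/code&gt; accepts explicit &lt;code&gt;breaks&lt;/code&gt; in place of its default levels:&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;# Finer, explicit contour breakpoints to pick out the chains of quakes
g.earthquakes +
  geom_contour(data = earthquake.densities,
               aes(x = Long, y = Lat, z = Density),
               breaks = seq(0.02, 0.3, by = 0.02),  # illustrative values
               color = plot.colors[3])&lt;/code&gt;&lt;/pre&gt;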
&lt;p&gt;This should be a good first step toward looking at densities of geographic events.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div id=&#34;next&#34; class=&#34;section level1&#34;&gt;
&lt;h1&gt;Next&lt;/h1&gt;
&lt;p&gt;While this should have given a good introduction to densities on a sphere and the issues with using the default density functions, there is still more we can do. We’ve got a few more posts coming:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Heatmaps&lt;/strong&gt; - Working with heatmaps means generating raster data, and projecting raster data adds more complexity&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;More Real Examples&lt;/strong&gt; - I mentioned I had an actual project I was curious about, right?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;If you want to explore the code yourself, everything is on &lt;a href=&#34;https://github.com/mwaldstein/rSphericalDensity&#34;&gt;github here&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id=&#34;appendix&#34; class=&#34;section level1&#34;&gt;
&lt;h1&gt;Appendix&lt;/h1&gt;
&lt;div id=&#34;spherical-density-function&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;Spherical Density Function&lt;/h2&gt;
&lt;p&gt;This calculates a grid of densities which can then be used with &lt;code&gt;geom_contour&lt;/code&gt;. The code comes almost directly from &lt;a href=&#34;https://rdrr.io/cran/Directional/man/vmf.kerncontour.html&#34;&gt;Directional’s vmf.kerncontour&lt;/a&gt;, only returning the computed grid instead of plotting the output.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;vmf.kerncontour.new &amp;lt;- function(u, thumb = &amp;quot;none&amp;quot;, ret.all = FALSE, full = FALSE,
                            ngrid = 100) {
  ## u contains the data in latitude and longitude
  ## the first column is the latitude and the
  ## second column is the longitude
  ## thumb is either &amp;#39;none&amp;#39; (default), or &amp;#39;rot&amp;#39; (Garcia-Portugues, 2013)
  ## ret.all if set to TRUE returns a matrix with latitude, longitude and density
  ## full if set to TRUE calculates densities for the full sphere, otherwise
  ##   using extents of the data
  ## ngrid specifies the number of points taken at each axis
  n &amp;lt;- dim(u)[1]  ## sample size
  x &amp;lt;- euclid(u)

  if (thumb == &amp;quot;none&amp;quot;) {
    h &amp;lt;- as.numeric( vmfkde.tune(x, low = 0.1, up = 1)[1] )
  } else if (thumb == &amp;quot;rot&amp;quot;) {
    k &amp;lt;- vmf(x)$kappa
    h &amp;lt;- ( (8 * sinh(k)^2) / (k * n * ( (1 + 4 * k^2) * sinh(2 * k) -
    2 * k * cosh(2 * k)) ) ) ^ ( 1/6 )
  }

  if (full) {
    x1 &amp;lt;- seq( 0, 180, length = ngrid )  ## latitude
    x2 &amp;lt;- seq( 0, 360, length = ngrid )  ## longitude
  } else {
    x1 &amp;lt;- seq( min(u[, 1]) - 5, max(u[, 1]) + 5, length = ngrid )  ## latitude
    x2 &amp;lt;- seq( min(u[, 2]) - 5, max(u[, 2]) + 5, length = ngrid )  ## longitude
  }
  cpk &amp;lt;- 1 / (  ( h^2)^0.5 *(2 * pi)^1.5 * besselI(1/h^2, 0.5) )
  mat &amp;lt;- matrix(nrow = ngrid, ncol = ngrid)

  for (i in 1:ngrid) {
    for (j in 1:ngrid) {
      y &amp;lt;- euclid( c(x1[i], x2[j]) )
      a &amp;lt;- as.vector( tcrossprod(x, y / h^2) )
      can &amp;lt;- sum( exp(a + log(cpk)) ) / ngrid
      if (abs(can) &amp;lt; Inf)   mat[i, j] &amp;lt;- can
    }
  }

  if (ret.all) {
    return(list(Lat = x1, Long = x2, h = h, d = mat))
  } else {
    contour(x1, x2, mat, nlevels = 10, col = 2, xlab = &amp;quot;Latitude&amp;quot;,
            ylab = &amp;quot;Longitude&amp;quot;)
    points(u[, 1], u[, 2])
  }
}&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;
&lt;div id=&#34;references&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;References&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Earthquake data was accessed through the &lt;a href=&#34;http://www.ncedc.org/anss/catalog-search.html&#34;&gt;Northern California Earthquake Data Center (NCEDC)&lt;/a&gt;, &lt;a href=&#34;doi:10.7932/NCEDC&#34; class=&#34;uri&#34;&gt;doi:10.7932/NCEDC&lt;/a&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>DartCannon: Estimating at the grocery store</title>
      <link>/blog/2018/05/dartcannon-estimating-at-the-grocery-store/</link>
      <pubDate>Mon, 28 May 2018 01:57:44 +0000</pubDate>
      
      <guid>/blog/2018/05/dartcannon-estimating-at-the-grocery-store/</guid>
      <description>

&lt;p&gt;To understand probability in forecasting, we can take a trip to the grocery store.&lt;/p&gt;

&lt;h3 id=&#34;what-we-39-re-going-to-do.display-2&#34;&gt;What We&amp;#8217;re Going To Do&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Demonstrate Estimation by shopping for produce&lt;/li&gt;
&lt;li&gt;Look at reducing uncertainty&lt;/li&gt;
&lt;li&gt;Explain reducible vs irreducible uncertainty&lt;/li&gt;
&lt;/ul&gt;

&lt;h2 id=&#34;the-basic-scenario.display-3&#34;&gt;The Basic Scenario&lt;/h2&gt;

&lt;p&gt;Let&amp;#8217;s say we&amp;#8217;re going to shop for ingredients for a fruit salad consisting of 2 apples, 1 banana, and some grapes.&lt;/p&gt;

&lt;p&gt;Without going any further, you probably can make a reasonable guess about how much things will cost. I&amp;#8217;d say we&amp;#8217;d spend roughly the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Apples&lt;/strong&gt;: Between $2 and $4, but most likely $2.50&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Bananas&lt;/strong&gt;: $1 &amp;#8211; $2, most likely $1.50&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Grapes&lt;/strong&gt;: $2.50 &amp;#8211; $4.00, most likely $3.00&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Plus, at my store apples are on sale ~20% of the time, which could take between $0.25 and $0.75 off. We can plug these estimates into &lt;a href=&#34;https://dartcannon.com&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt;, giving us the following model:&lt;/p&gt;

&lt;p&gt;&lt;img src=&#34;https://dartcannon.com/static/img/fruitSalad_model1.5cf4c66.png&#34; alt=&#34;DC Fruitsalad Model&#34; /&gt;&lt;/p&gt;

&lt;p&gt;Running these numbers through &lt;a href=&#34;https://dartcannon.com&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt; gives the following distribution:&lt;/p&gt;

&lt;p&gt;&lt;img src=&#34;https://dartcannon.com/static/img/fruitSalad_results1.5803dc2.png&#34; alt=&#34;DC Fruitsalad Results&#34; /&gt;&lt;/p&gt;

&lt;p&gt;One thing &lt;a href=&#34;https://dartcannon.com&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt; shows immediately, which might not have been obvious otherwise: while the full range of outcomes runs from $4.75 to $10, a spread of $5.25, the central 90% range is only $2.&lt;/p&gt;
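&lt;p&gt;The engine behind &lt;a href=&#34;https://dartcannon.com&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt; is proprietary, but the general idea can be sketched in a few lines of R (an illustration only, assuming a triangular distribution for each low/most-likely/high estimate):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Sample a triangular distribution via its inverse CDF
rtri &amp;lt;- function(n, lo, mode, hi) {
  u  &amp;lt;- runif(n)
  fc &amp;lt;- (mode - lo) / (hi - lo)
  ifelse(u &amp;lt; fc,
         lo + sqrt(u * (hi - lo) * (mode - lo)),
         hi - sqrt((1 - u) * (hi - lo) * (hi - mode)))
}
n &amp;lt;- 1e5
total &amp;lt;- rtri(n, 2.00, 2.50, 4.00) +  # apples
         rtri(n, 1.00, 1.50, 2.00) +  # bananas
         rtri(n, 2.50, 3.00, 4.00) -  # grapes
         ifelse(runif(n) &amp;lt; 0.2,       # ~20% chance of an apple sale
                runif(n, 0.25, 0.75), 0)
quantile(total, c(0.05, 0.50, 0.95))  # central 90% range&lt;/code&gt;&lt;/pre&gt;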

&lt;h2 id=&#34;reducing-uncertainty.display-3&#34;&gt;Reducing uncertainty&lt;/h2&gt;

&lt;p&gt;For some projects this level of estimation is good enough to make decisions, but perhaps we need more accuracy. For our grocery example, instead of relying on our existing feelings about prices and the historic chance of a sale, we could look up current prices and sales, and find the average weights of the fruits.&lt;/p&gt;

&lt;p&gt;Finding current prices would remove much of the uncertainty, but fruit (at least where we&amp;#8217;re based) is priced by weight while being sold by unit. Since no two pieces of fruit weigh exactly the same, if we need 2 apples we can&amp;#8217;t say precisely how much they will weigh, so we still have some uncertainty.&lt;/p&gt;

&lt;p&gt;Let&amp;#8217;s say we went through this process and arrived at:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Apples&lt;/strong&gt;: Between $2.20 and $3, but most likely $2.60&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Bananas&lt;/strong&gt;: $1.25 &amp;#8211; $1.50, most likely $1.30&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Grapes&lt;/strong&gt;: $3.50 &amp;#8211; $3.60, most likely $3.50&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;img src=&#34;https://dartcannon.com/static/img/fruitSalad_model2.830e4f8.png&#34; alt=&#34;DC Fruitsalad Model 2&#34; /&gt; &lt;img src=&#34;https://dartcannon.com/static/img/fruitSalad_results2.eaac205.png&#34; alt=&#34;DC Fruitsalad Results 2&#34; /&gt;&lt;/p&gt;

&lt;p&gt;We can see that the range is greatly reduced and the central 90% range is now only $0.50.&lt;/p&gt;

&lt;p&gt;While this wasn&amp;#8217;t a huge effort for a fruit salad, improving estimates isn&amp;#8217;t always worth it; it depends on the decisions we need to make and how much improving those estimates costs.&lt;/p&gt;

&lt;h2 id=&#34;reducible-vs-irreducible-uncertainty&#34;&gt;Reducible vs. Irreducible Uncertainty&lt;/h2&gt;

&lt;p&gt;The exact price of the produce is a &lt;em&gt;reducible&lt;/em&gt; source of uncertainty.
We were able to eliminate it by putting in the effort to look up current prices.&lt;/p&gt;

&lt;p&gt;Not knowing the exact weight of our produce is &lt;em&gt;irreducible&lt;/em&gt; as we can&amp;#8217;t know what it will be until we actually go to the store to buy them.&lt;/p&gt;

&lt;p&gt;In most endeavors there is always a mix of reducible and irreducible sources of uncertainty. For reducible uncertainty, the question is whether reducing the risk is worth the cost, or whether you can live with it for the decisions you need to make.&lt;/p&gt;

&lt;p&gt;For irreducible risk the questions are similar, only instead of spending to reduce the uncertainty, the question becomes one of buying insurance or mitigating the risk if it is difficult to live with.&lt;/p&gt;

&lt;p&gt;&lt;a href=&#34;https://dartcannon.com&#34; target=&#34;_blank&#34;&gt;Dartcannon&lt;/a&gt; can help guide discussions of risk by exposing the range of risks and helping focus on the most likely range of outcomes rather than the unlikely extremes.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Cross Posted from &lt;a href=&#34;https://dartcannon.com/blog/2018-estimating-at-the-grocery-store&#34; target=&#34;_blank&#34;&gt;https://dartcannon.com/blog/2018-estimating-at-the-grocery-store&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>Getting Started With PDS3</title>
      <link>/external/pds3/vignettes/pds3/</link>
      <pubDate>Tue, 22 May 2018 00:00:00 +0000</pubDate>
      
      <guid>/external/pds3/vignettes/pds3/</guid>
      <description>&lt;p&gt;PDS3 is a data standard used extensively by NASA for archiving data from science missions, maintained by JPL. While being replaced by PDS4, PDS4 covers all currently active missions and those covering the history of US space exploration.&lt;/p&gt;
&lt;p&gt;The R pds3 package provides tools for parsing PDS3 data, particularly the ODL label format which describes the metadata of each data collection. Want to plot a heatmap of everywhere on Mars an image has been taken? This is the package for you!&lt;/p&gt;
&lt;div id=&#34;getting-started&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;Getting Started&lt;/h2&gt;
&lt;p&gt;pds3 is available from CRAN, so it can be installed simply via&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;install.packages(&amp;quot;pds3&amp;quot;)&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;To install the development version you can use &lt;code&gt;devtools&lt;/code&gt; to install:&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;# Install the development version from GitHub:
install.packages(&amp;quot;devtools&amp;quot;) # If you don&amp;#39;t already have it
devtools::install_github(&amp;quot;mwaldstein/pds3&amp;quot;)&lt;/code&gt;&lt;/pre&gt;
&lt;div id=&#34;finding-data&#34; class=&#34;section level3&#34;&gt;
&lt;h3&gt;Finding Data&lt;/h3&gt;
&lt;p&gt;If you’re exploring this package, chances are you already have a collection of PDS files, but if you’re interested in exploring NASA data, the place to get started is &lt;a href=&#34;https://pds.jpl.nasa.gov/datasearch/data-search/&#34;&gt;The Planetary Data System&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Once you find a mission you’re interested in exploring, finding the data explorer is typically straightforward. The metadata this package processes is typically stored in .lbl files. For instance, we’ll look at the labels from the Mars Reconnaissance Orbiter’s HiRISE experiment, particularly the data associated with &lt;a href=&#34;http://ode.rsl.wustl.edu/mars/indexproductpage.aspx?product_idgeo=13465046&#34;&gt;this image&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div id=&#34;reading-parsing-data&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;Reading &amp;amp; Parsing Data&lt;/h2&gt;
&lt;p&gt;We’ll start by grabbing the label (LBL) file.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;href &amp;lt;- &amp;#39;http://hirise.lpl.arizona.edu/PDS/RDR/ESP/ORB_011700_011799/ESP_011707_1440/ESP_011707_1440_COLOR.LBL&amp;#39;
req &amp;lt;- curl::curl_fetch_memory(href)
dat &amp;lt;- rawToChar(req$content)&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;Now that we have the data, we’ll process it using &lt;code&gt;pds3_read&lt;/code&gt;.&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;library(pds3)
res &amp;lt;- pds3_read(dat)
str(res$odl)
#&amp;gt; List of 26
#&amp;gt;  $ PDS_VERSION_ID               : chr &amp;quot;PDS3&amp;quot;
#&amp;gt;  $ NOT_APPLICABLE_CONSTANT      : int -9998
#&amp;gt;  $ DATA_SET_ID                  : chr &amp;quot;MRO-M-HIRISE-3-RDR-V1.1&amp;quot;
#&amp;gt;  $ DATA_SET_NAME                : chr &amp;quot;MRO MARS HIGH RESOLUTION IMAGING SCIENCE\r\n                            EXPERIMENT RDR V1.1&amp;quot;
#&amp;gt;  $ PRODUCER_INSTITUTION_NAME    : chr &amp;quot;UNIVERSITY OF ARIZONA&amp;quot;
#&amp;gt;  $ PRODUCER_ID                  : chr &amp;quot;UA&amp;quot;
#&amp;gt;  $ PRODUCER_FULL_NAME           : chr &amp;quot;ALFRED MCEWEN&amp;quot;
#&amp;gt;  $ OBSERVATION_ID               : chr &amp;quot;ESP_011707_1440&amp;quot;
#&amp;gt;  $ PRODUCT_ID                   : chr &amp;quot;ESP_011707_1440_COLOR&amp;quot;
#&amp;gt;  $ PRODUCT_VERSION_ID           : chr &amp;quot;2.0&amp;quot;
#&amp;gt;  $ INSTRUMENT_HOST_NAME         : chr &amp;quot;MARS RECONNAISSANCE ORBITER&amp;quot;
#&amp;gt;  $ INSTRUMENT_HOST_ID           : chr &amp;quot;MRO&amp;quot;
#&amp;gt;  $ INSTRUMENT_NAME              : chr &amp;quot;HIGH RESOLUTION IMAGING SCIENCE EXPERIMENT&amp;quot;
#&amp;gt;  $ INSTRUMENT_ID                : chr &amp;quot;HIRISE&amp;quot;
#&amp;gt;  $ TARGET_NAME                  : chr &amp;quot;MARS&amp;quot;
#&amp;gt;  $ MISSION_PHASE_NAME           : chr &amp;quot;EXTENDED SCIENCE PHASE&amp;quot;
#&amp;gt;  $ ORBIT_NUMBER                 : int 11707
#&amp;gt;  $ SOURCE_PRODUCT_ID            : chr [1:12] &amp;quot;ESP_011707_1440_BG12_0&amp;quot; &amp;quot;ESP_011707_1440_BG12_1&amp;quot; &amp;quot;ESP_011707_1440_RED4_0&amp;quot; &amp;quot;ESP_011707_1440_RED4_1&amp;quot; ...
#&amp;gt;  $ RATIONALE_DESC               : chr &amp;quot;Channels on crater rim&amp;quot;
#&amp;gt;  $ SOFTWARE_NAME                : chr &amp;quot;PDS_to_JP2 v3.15.5 (1.49 2008/07/12 04:09:51)&amp;quot;
#&amp;gt;  $ IMAGE_MAP_PROJECTION         :List of 24
#&amp;gt;   ..$ ^DATA_SET_MAP_PROJECTION    :List of 2
#&amp;gt;   .. ..$ value : chr &amp;quot;DSMAP.CAT&amp;quot;
#&amp;gt;   .. ..$ offset: num -1
#&amp;gt;   ..$ MAP_PROJECTION_TYPE         : chr &amp;quot;EQUIRECTANGULAR&amp;quot;
#&amp;gt;   ..$ PROJECTION_LATITUDE_TYPE    : chr &amp;quot;PLANETOCENTRIC&amp;quot;
#&amp;gt;   ..$ A_AXIS_RADIUS               :List of 2
#&amp;gt;   .. ..$ value: num 3390
#&amp;gt;   .. ..$ unit : chr &amp;quot;KM&amp;quot;
#&amp;gt;   ..$ B_AXIS_RADIUS               :List of 2
#&amp;gt;   .. ..$ value: num 3390
#&amp;gt;   .. ..$ unit : chr &amp;quot;KM&amp;quot;
#&amp;gt;   ..$ C_AXIS_RADIUS               :List of 2
#&amp;gt;   .. ..$ value: num 3390
#&amp;gt;   .. ..$ unit : chr &amp;quot;KM&amp;quot;
#&amp;gt;   ..$ COORDINATE_SYSTEM_NAME      : chr &amp;quot;PLANETOCENTRIC&amp;quot;
#&amp;gt;   ..$ POSITIVE_LONGITUDE_DIRECTION: chr &amp;quot;EAST&amp;quot;
#&amp;gt;   ..$ KEYWORD_LATITUDE_TYPE       : chr &amp;quot;PLANETOCENTRIC&amp;quot;
#&amp;gt;   ..$ CENTER_LATITUDE             :List of 2
#&amp;gt;   .. ..$ value: num -35
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;   ..$ CENTER_LONGITUDE            :List of 2
#&amp;gt;   .. ..$ value: num 180
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;   ..$ LINE_FIRST_PIXEL            : int 1
#&amp;gt;   ..$ LINE_LAST_PIXEL             : int 55636
#&amp;gt;   ..$ SAMPLE_FIRST_PIXEL          : int 1
#&amp;gt;   ..$ SAMPLE_LAST_PIXEL           : int 10633
#&amp;gt;   ..$ MAP_PROJECTION_ROTATION     :List of 2
#&amp;gt;   .. ..$ value: num 0
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;   ..$ MAP_RESOLUTION              :List of 2
#&amp;gt;   .. ..$ value: num 236637
#&amp;gt;   .. ..$ unit : chr &amp;quot;PIX/DEG&amp;quot;
#&amp;gt;   ..$ MAP_SCALE                   :List of 2
#&amp;gt;   .. ..$ value: num 0.25
#&amp;gt;   .. ..$ unit : chr &amp;quot;METERS/PIXEL&amp;quot;
#&amp;gt;   ..$ MAXIMUM_LATITUDE            :List of 2
#&amp;gt;   .. ..$ value: num -35.4
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;   ..$ MINIMUM_LATITUDE            :List of 2
#&amp;gt;   .. ..$ value: num -35.6
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;   ..$ LINE_PROJECTION_OFFSET      :List of 2
#&amp;gt;   .. ..$ value: num -8377212
#&amp;gt;   .. ..$ unit : chr &amp;quot;PIXEL&amp;quot;
#&amp;gt;   ..$ SAMPLE_PROJECTION_OFFSET    :List of 2
#&amp;gt;   .. ..$ value: num 7151126
#&amp;gt;   .. ..$ unit : chr &amp;quot;PIXEL&amp;quot;
#&amp;gt;   ..$ EASTERNMOST_LONGITUDE       :List of 2
#&amp;gt;   .. ..$ value: num 143
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;   ..$ WESTERNMOST_LONGITUDE       :List of 2
#&amp;gt;   .. ..$ value: num 143
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;  $ TIME_PARAMETERS              :List of 6
#&amp;gt;   ..$ MRO:OBSERVATION_START_TIME  : POSIXlt[1:1], format: &amp;quot;2009-01-24 17:07:14&amp;quot;
#&amp;gt;   ..$ START_TIME                  : POSIXlt[1:1], format: &amp;quot;2009-01-24 17:07:14&amp;quot;
#&amp;gt;   ..$ SPACECRAFT_CLOCK_START_COUNT: chr &amp;quot;917284056:11103&amp;quot;
#&amp;gt;   ..$ STOP_TIME                   : POSIXlt[1:1], format: &amp;quot;2009-01-24 17:07:19&amp;quot;
#&amp;gt;   ..$ SPACECRAFT_CLOCK_STOP_COUNT : chr &amp;quot;917284060:44707&amp;quot;
#&amp;gt;   ..$ PRODUCT_CREATION_TIME       : POSIXlt[1:1], format: &amp;quot;2009-09-12 04:31:46&amp;quot;
#&amp;gt;  $ INSTRUMENT_SETTING_PARAMETERS:List of 4
#&amp;gt;   ..$ MRO:CCD_FLAG               : chr [1:14] &amp;quot;ON&amp;quot; &amp;quot;ON&amp;quot; &amp;quot;ON&amp;quot; &amp;quot;ON&amp;quot; ...
#&amp;gt;   ..$ MRO:BINNING                : int [1:14] -9998 -9998 -9998 -9998 1 1 -9998 -9998 -9998 -9998 ...
#&amp;gt;   ..$ MRO:TDI                    : int [1:14] -9998 -9998 -9998 -9998 128 128 -9998 -9998 -9998 -9998 ...
#&amp;gt;   ..$ MRO:SPECIAL_PROCESSING_FLAG: chr [1:14] &amp;quot;NULL&amp;quot; &amp;quot;NULL&amp;quot; &amp;quot;NULL&amp;quot; &amp;quot;NULL&amp;quot; ...
#&amp;gt;  $ VIEWING_PARAMETERS           :List of 7
#&amp;gt;   ..$ INCIDENCE_ANGLE  :List of 2
#&amp;gt;   .. ..$ value: num 60.9
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;   ..$ EMISSION_ANGLE   :List of 2
#&amp;gt;   .. ..$ value: num 0.27
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;   ..$ PHASE_ANGLE      :List of 2
#&amp;gt;   .. ..$ value: num 61
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;   ..$ LOCAL_TIME       :List of 2
#&amp;gt;   .. ..$ value: num 16
#&amp;gt;   .. ..$ unit : chr &amp;quot;LOCALDAY/24&amp;quot;
#&amp;gt;   ..$ SOLAR_LONGITUDE  :List of 2
#&amp;gt;   .. ..$ value: num 197
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;   ..$ SUB_SOLAR_AZIMUTH:List of 2
#&amp;gt;   .. ..$ value: num 193
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;   ..$ NORTH_AZIMUTH    :List of 2
#&amp;gt;   .. ..$ value: num 270
#&amp;gt;   .. ..$ unit : chr &amp;quot;DEG&amp;quot;
#&amp;gt;  $ COMPRESSED_FILE              :List of 8
#&amp;gt;   ..$ FILE_NAME                 : chr &amp;quot;ESP_011707_1440_COLOR.JP2&amp;quot;
#&amp;gt;   ..$ RECORD_TYPE               : chr &amp;quot;UNDEFINED&amp;quot;
#&amp;gt;   ..$ ENCODING_TYPE             : chr &amp;quot;JP2&amp;quot;
#&amp;gt;   ..$ ENCODING_TYPE_VERSION_NAME: chr &amp;quot;ISO/IEC15444-1:2004&amp;quot;
#&amp;gt;   ..$ INTERCHANGE_FORMAT        : chr &amp;quot;BINARY&amp;quot;
#&amp;gt;   ..$ UNCOMPRESSED_FILE_NAME    : chr &amp;quot;ESP_011707_1440_COLOR.IMG&amp;quot;
#&amp;gt;   ..$ REQUIRED_STORAGE_BYTES    :List of 2
#&amp;gt;   .. ..$ value: int NA
#&amp;gt;   .. ..$ unit : chr &amp;quot;BYTES&amp;quot;
#&amp;gt;   ..$ ^DESCRIPTION              :List of 2
#&amp;gt;   .. ..$ value : chr &amp;quot;JP2INFO.TXT&amp;quot;
#&amp;gt;   .. ..$ offset: num -1
#&amp;gt;  $ UNCOMPRESSED_FILE            :List of 6
#&amp;gt;   ..$ FILE_NAME   : chr &amp;quot;ESP_011707_1440_COLOR.IMG&amp;quot;
#&amp;gt;   ..$ RECORD_TYPE : chr &amp;quot;FIXED_LENGTH&amp;quot;
#&amp;gt;   ..$ RECORD_BYTES:List of 2
#&amp;gt;   .. ..$ value: int 21266
#&amp;gt;   .. ..$ unit : chr &amp;quot;BYTES&amp;quot;
#&amp;gt;   ..$ FILE_RECORDS: int 166908
#&amp;gt;   ..$ ^IMAGE      :List of 2
#&amp;gt;   .. ..$ value : chr &amp;quot;ESP_011707_1440_COLOR.IMG&amp;quot;
#&amp;gt;   .. ..$ offset: num -1
#&amp;gt;   ..$ IMAGE       :List of 19
#&amp;gt;   .. ..$ DESCRIPTION               : chr &amp;quot;HiRISE projected and mosaicked product&amp;quot;
#&amp;gt;   .. ..$ LINES                     : int 55636
#&amp;gt;   .. ..$ LINE_SAMPLES              : int 10633
#&amp;gt;   .. ..$ BANDS                     : int 3
#&amp;gt;   .. ..$ SAMPLE_TYPE               : chr &amp;quot;MSB_UNSIGNED_INTEGER&amp;quot;
#&amp;gt;   .. ..$ SAMPLE_BITS               : int 16
#&amp;gt;   .. ..$ SAMPLE_BIT_MASK           : int 1023
#&amp;gt;   .. ..$ SCALING_FACTOR            : num 0.000136
#&amp;gt;   .. ..$ OFFSET                    : num 0.0332
#&amp;gt;   .. ..$ BAND_STORAGE_TYPE         : chr &amp;quot;BAND_SEQUENTIAL&amp;quot;
#&amp;gt;   .. ..$ CORE_NULL                 : int 0
#&amp;gt;   .. ..$ CORE_LOW_REPR_SATURATION  : int 1
#&amp;gt;   .. ..$ CORE_LOW_INSTR_SATURATION : int 2
#&amp;gt;   .. ..$ CORE_HIGH_REPR_SATURATION : int 1023
#&amp;gt;   .. ..$ CORE_HIGH_INSTR_SATURATION: int 1022
#&amp;gt;   .. ..$ CENTER_FILTER_WAVELENGTH  :List of 3
#&amp;gt;   .. .. ..$ :List of 2
#&amp;gt;   .. .. .. ..$ value: int 900
#&amp;gt;   .. .. .. ..$ unit : chr &amp;quot;NM&amp;quot;
#&amp;gt;   .. .. ..$ :List of 2
#&amp;gt;   .. .. .. ..$ value: int 700
#&amp;gt;   .. .. .. ..$ unit : chr &amp;quot;NM&amp;quot;
#&amp;gt;   .. .. ..$ :List of 2
#&amp;gt;   .. .. .. ..$ value: int 500
#&amp;gt;   .. .. .. ..$ unit : chr &amp;quot;NM&amp;quot;
#&amp;gt;   .. ..$ MRO:MINIMUM_STRETCH       : int [1:3] 3 3 3
#&amp;gt;   .. ..$ MRO:MAXIMUM_STRETCH       : int [1:3] 1021 1021 1021
#&amp;gt;   .. ..$ FILTER_NAME               : chr [1:3] &amp;quot;NEAR-INFRARED&amp;quot; &amp;quot;RED&amp;quot; &amp;quot;BLUE-GREEN&amp;quot;&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;
&lt;div id=&#34;cautionary-notes&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;Cautionary Notes&lt;/h2&gt;
&lt;p&gt;All of the data you can access from NASA missions is public, but be aware that if you are going to download a large amount, there are almost always better ways than web scraping, be it a catalog-only download or FTP access. Investigate your options and be considerate of the servers.&lt;/p&gt;
&lt;p&gt;Also of note: while the data is public and free to use, it is the product of a lot of hard work, so it is appropriate to cite the data source and the principal investigators of the instrument whose data you are using. NASA provides a guide to &lt;a href=&#34;https://pds.nasa.gov/datastandards/pds3/citing-pds3-data.shtml&#34;&gt;Citing PDS3 Data&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;One cool thing is that many PDS catalogs include citation information in a PDS3-formatted file, which we can use this package to extract!&lt;/p&gt;
&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;cit_href &amp;lt;- &amp;quot;https://hirise.lpl.arizona.edu/PDS/CATALOG/RDR_DS.CAT&amp;quot;
cit_req &amp;lt;- curl::curl_fetch_memory(cit_href)
cit_dat &amp;lt;- rawToChar(cit_req$content)
cit_res &amp;lt;- pds3_read(cit_dat)

cit_res$odl$DATA_SET$DATA_SET_INFORMATION$CITATION_DESC
#&amp;gt; [1] &amp;quot;McEwen, A., Mars Reconnaissance \r\n      Orbiter High Resolution Imaging Science Experiment, Reduced \r\n      Data Record, MRO-M-HIRISE-3-RDR-V1.0, NASA Planetary Data\r\n      System, 2007.&amp;quot;&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;
&lt;div id=&#34;references&#34; class=&#34;section level2&#34;&gt;
&lt;h2&gt;References&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;The MRO-M-HIRISE-3-RDR-V1.0 data set was obtained from the &lt;a href=&#34;https://pds.nasa.gov&#34;&gt;Planetary Data System (PDS)&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;McEwen, A., Mars Reconnaissance Orbiter High Resolution Imaging Science Experiment, Reduced Data Record, MRO-M-HIRISE-3-RDR-V1.0, NASA Planetary Data System, 2007.&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
</description>
    </item>
    
    <item>
      <title>DartCannon: Where We’re Going</title>
      <link>/blog/2018/05/dartcannon-where-were-going/</link>
      <pubDate>Mon, 21 May 2018 01:54:38 +0000</pubDate>
      
      <guid>/blog/2018/05/dartcannon-where-were-going/</guid>
      <description>

&lt;p&gt;We&amp;#8217;re so excited to see the response to our launch that we wanted to let you know what you can expect from us in the coming months. To fulfill our mission of bringing advanced tools to leaders at all levels, we launched with just the basics in place and have so much more planned.&lt;/p&gt;

&lt;p&gt;While we can&amp;#8217;t commit to a specific date, here is some of what we&amp;#8217;re working on:&lt;/p&gt;

&lt;h3 id=&#34;shared-simulations&#34;&gt;Shared Simulations&lt;/h3&gt;

&lt;p&gt;To keep things simple, we currently do not allow sharing simulations. Our first major feature will allow sharing simulations both for collaboration and in a read-only mode. This will also let you share simulations publicly so anyone with the link can view them.&lt;/p&gt;

&lt;h3 id=&#34;scrum-simulations&#34;&gt;Scrum Simulations&lt;/h3&gt;

&lt;p&gt;We know that not everyone runs their projects in the same way and while we’ve always planned on supporting agile methodologies, finding the right general approach has delayed our first implementation. In the coming months we’ll be releasing a third simulation type to support people using scrum/agile methods for projects.&lt;/p&gt;

&lt;h3 id=&#34;powerpoint-export&#34;&gt;PowerPoint Export&lt;/h3&gt;

&lt;p&gt;While we don’t expect anyone to immediately turn around and project our presentations, we do want to provide slides people can use to share plans and progress in easily editable formats.&lt;/p&gt;

&lt;h3 id=&#34;correlated-items&#34;&gt;Correlated Items&lt;/h3&gt;

&lt;p&gt;A major assumption of our overall approach is that individual tasks or line items are independent of one another. In practice, however, actual outcomes tend to all run low or all run high together. This feature will be as easy to use as the rest of DartCannon and will provide insights unavailable from other easily accessible tools.&lt;/p&gt;

&lt;h3 id=&#34;combined-schedule-budget&#34;&gt;Combined Schedule / Budget&lt;/h3&gt;

&lt;p&gt;While many projects have the benefit of only needing to worry about budget or schedule independently, if you don’t have that luxury you’ll need to see how they both come together.&lt;/p&gt;

&lt;h2 id=&#34;when-you-can-expect-them&#34;&gt;When You Can Expect Them&lt;/h2&gt;

&lt;p&gt;We’re still working on prioritizing all these features and would love to hear from you about what would be most beneficial to your work. We’ll always continue with new tutorials and guides to getting the most out of &lt;a href=&#34;https://dartcannon.com&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt;, along with bug fixes and minor quality-of-life improvements.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Cross-posted from &lt;a href=&#34;https://dartcannon.com/blog/2018-where-we&#39;re-going&#34; target=&#34;_blank&#34;&gt;https://dartcannon.com/blog/2018-where-we&amp;#8217;re-going&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>DartCannon</title>
      <link>/project/dartcannon/</link>
      <pubDate>Thu, 17 May 2018 00:00:00 -0400</pubDate>
      
      <guid>/project/dartcannon/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://dartcannon.com&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt; was founded on the idea that advanced tools should be available to all
managers in an easy-to-use form. After gestating for 10 years, the time and
technology are finally at a point where tools which previously cost thousands of dollars
per seat can be accessible to everyone.&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>edgarWebR</title>
      <link>/project/edgarwebr/</link>
      <pubDate>Thu, 17 May 2018 00:00:00 -0400</pubDate>
      
      <guid>/project/edgarwebr/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://mwaldstein.github.io/edgarWebR/&#34; target=&#34;_blank&#34;&gt;edgarWebR&lt;/a&gt; is an R library which provides an interface to the SEC&amp;rsquo;s EDGAR system for company financial filings.&lt;/p&gt;

&lt;p&gt;edgarWebR does not provide any functionality to extract financial data or other information from filings, only the metadata and company information.&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>pds3</title>
      <link>/project/pds3/</link>
      <pubDate>Thu, 17 May 2018 00:00:00 -0400</pubDate>
      
      <guid>/project/pds3/</guid>
      <description>

&lt;p&gt;&lt;a href=&#34;https://github.com/mwaldstein/pds3&#34; target=&#34;_blank&#34;&gt;pds3&lt;/a&gt; reads &lt;a href=&#34;https://pds.jpl.nasa.gov/datastandards/pds3/&#34; target=&#34;_blank&#34;&gt;PDS3&lt;/a&gt; files, a standard published by JPL and used throughout
NASA space missions. While PDS3 is being supplanted by PDS4, an XML-based standard,
PDS3 is still in use and is needed for accessing historical data.&lt;/p&gt;
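&lt;p&gt;A minimal sketch of typical usage, repeating the catalog fetch from the blog post above (&lt;code&gt;pds3_read&lt;/code&gt; takes the label contents as a string; this assumes the pds3 and curl packages are installed):&lt;/p&gt;

&lt;pre class=&#34;r&#34;&gt;&lt;code&gt;library(pds3)

# Fetch a PDS3 label as text and parse it
href &amp;lt;- &amp;quot;https://hirise.lpl.arizona.edu/PDS/CATALOG/RDR_DS.CAT&amp;quot;
req &amp;lt;- curl::curl_fetch_memory(href)
res &amp;lt;- pds3_read(rawToChar(req$content))

# The parsed label is available as a nested list under $odl
str(res$odl, max.level = 1)&lt;/code&gt;&lt;/pre&gt;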

&lt;h2 id=&#34;references&#34;&gt;References&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href=&#34;https://pds.jpl.nasa.gov/datastandards/pds3/&#34; target=&#34;_blank&#34;&gt;PDS3 Data Standards&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://pds.jpl.nasa.gov/datastandards/pds3/standards/&#34; target=&#34;_blank&#34;&gt;PDS3 Standards Reference&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href=&#34;https://pds.jpl.nasa.gov/&#34; target=&#34;_blank&#34;&gt;PDS Data Archive&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
</description>
    </item>
    
    <item>
      <title>Introducing DartCannon</title>
      <link>/blog/2018/05/introducing-dartcannon/</link>
      <pubDate>Tue, 15 May 2018 01:51:59 +0000</pubDate>
      
      <guid>/blog/2018/05/introducing-dartcannon/</guid>
      <description>&lt;p&gt;Most estimation is taking a single shot in the dark.
&lt;a href=&#34;https://dartcannon.com&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt; exists to let you take thousands of shots on the goal &amp;#8211; firing as many darts at the dart board as needed to get an understanding of where they&amp;#8217;ll fall.
Previously this capability was limited to planning departments with deep pockets, willing to shell out for esoteric, complicated pieces of software.
&lt;a href=&#34;https://dartcannon.com&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt; changes the game, bringing those advanced tools to a price point and simplicity where anyone can use them.&lt;/p&gt;

&lt;p&gt;And you can start for free. Because we&amp;#8217;re committed to improving estimation, we&amp;#8217;re letting everyone use the basics of our tool at no out-of-pocket expense and with no credit card required.&lt;/p&gt;

&lt;p&gt;Of course, we hope you stick around and subscribe to access our premium features &amp;#8211;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Unlimited Simulations&lt;/li&gt;
&lt;li&gt;Unlimited Complexity to Simulations&lt;/li&gt;
&lt;li&gt;High-Resolution Simulations&lt;/li&gt;
&lt;li&gt;Excel&amp;trade; Import and Export&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We plan to continue to push the envelope of what you can do with &lt;a href=&#34;https://dartcannon.com&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt; and help ensure that leaders at all levels have access to the most advanced tools to move the art of management forward. We&amp;#8217;re not done either &amp;#8211; in the coming weeks we&amp;#8217;ll be sharing our roadmap and how we hope to continue to improve and provide more features, both for free and premium users.&lt;/p&gt;

&lt;p&gt;Sign up today: &lt;a href=&#34;https://dartcannon.com/login&#34; target=&#34;_blank&#34;&gt;https://dartcannon.com/login&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Cross-Posted from the new startup I&amp;rsquo;ve been working on, &lt;a href=&#34;https://dartcannon.com/blog/2018-welcome-to-dartcannon&#34; target=&#34;_blank&#34;&gt;DartCannon&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>MPOD #5 – George (1/n)</title>
      <link>/blog/2018/01/mpod-5-george-1-n/</link>
      <pubDate>Fri, 19 Jan 2018 20:00:39 +0000</pubDate>
      
      <guid>/blog/2018/01/mpod-5-george-1-n/</guid>
      <description>&lt;p&gt;&lt;img class=&#34;alignnone size-full wp-image-487&#34; src=&#34;https://micah.waldste.in/blog/wp-content/uploads/2018/01/DSCF3960.jpg&#34; alt=&#34;&#34; /&gt;&lt;/p&gt;
</description>
    </item>
    
    <item>
      <title>MPOD #4 – Hadrian’s Wall</title>
      <link>/blog/2018/01/mpod-4-hadrians-wall/</link>
      <pubDate>Thu, 18 Jan 2018 17:00:01 +0000</pubDate>
      
      <guid>/blog/2018/01/mpod-4-hadrians-wall/</guid>
      <description>&lt;p&gt;&lt;img class=&#34;alignnone size-full wp-image-484&#34; src=&#34;https://micah.waldste.in/blog/wp-content/uploads/2018/01/IMG_0901.jpg&#34; alt=&#34;&#34; width=&#34;3349&#34; height=&#34;2162&#34; srcset=&#34;https://micah.waldste.in/blog/wp-content/uploads/2018/01/IMG_0901.jpg 3349w, https://micah.waldste.in/blog/wp-content/uploads/2018/01/IMG_0901-300x194.jpg 300w, https://micah.waldste.in/blog/wp-content/uploads/2018/01/IMG_0901-768x496.jpg 768w, https://micah.waldste.in/blog/wp-content/uploads/2018/01/IMG_0901-1024x661.jpg 1024w&#34; sizes=&#34;(max-width: 706px) 89vw, (max-width: 767px) 82vw, 740px&#34; /&gt;&lt;/p&gt;
</description>
    </item>
    
  </channel>
</rss>
