Data Visualization – Golden Software

What Is a Bar Chart: A Quick-Start Guide to Effective Application & Design

Bar charts are one of the most effective tools for scientific communication. In geoscience and engineering workflows—where datasets are diverse, stakeholders vary in technical expertise, and decisions depend on clear interpretation—bar charts offer a level of clarity that more complex charts sometimes obscure. To show why bar charts are such effective visuals, we’re breaking down not only what a bar chart is, but also the benefits it brings and the best way to design one, so you can use it with confidence and clarity in your own projects.

What Is a Bar Chart?

At its core, a bar chart is one of the simplest and most powerful ways to compare values across discrete categories. With this visual, each bar represents a category, and the length or height of a bar corresponds directly to its value. This makes patterns, differences, and rankings immediately visible.

Additionally, bar charts come in two common orientations:

  • Vertical bar charts: These are ideal for showing changes or comparisons along a natural left-to-right progression.
  • Horizontal bar charts: These excel when category names are long, numerous, or better read top-to-bottom.

What makes bar charts especially effective is their intuitiveness. Our eyes easily compare lengths, so even complex datasets become clearer when placed in a bar-based format. That accessibility is why bar charts remain a foundational tool in scientific communication.
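To make the orientation choice concrete, here is a minimal sketch in Python’s matplotlib. The monitoring-well names and concentration values are invented for illustration; any plotting tool, including Grapher, follows the same logic.

    import matplotlib.pyplot as plt

    # Hypothetical values: contaminant concentration by monitoring well (made up)
    wells = ["MW-1", "MW-2", "MW-3", "MW-4"]
    concentration = [12.4, 30.1, 8.7, 21.5]  # mg/L

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

    # Vertical bars: a natural left-to-right comparison
    ax1.bar(wells, concentration, color="steelblue")
    ax1.set_ylabel("Concentration (mg/L)")
    ax1.set_title("Vertical bar chart")

    # Horizontal bars: easier to read when category names are long or numerous
    ax2.barh(wells, concentration, color="steelblue")
    ax2.set_xlabel("Concentration (mg/L)")
    ax2.set_title("Horizontal bar chart")

    fig.tight_layout()
    plt.show()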

Example: A vertical bar chart showing data on U.S. trade in goods and services.
Example: A horizontal bar chart showing homes sold in Denver, Colorado, in 2015.
Example: A polar bar chart, a scientific and technical visualization.
Example: A wind rose diagram, a technical and scientific bar chart.

Why Bar Charts Work Well for Categorized Data

Now that we’ve defined a bar chart, the next question is: what makes bar charts uniquely effective for categorized data? In geoscience and engineering workflows, categorized data appears constantly—whether you’re comparing sample sites, evaluating different materials, reviewing test conditions, or summarizing results from multiple scenarios. Because these categories are discrete rather than continuous, you need a visual that makes comparisons clean, direct, and unmistakable.

That’s exactly where bar charts excel. By giving each category its own clearly separated bar, they make differences in magnitude instantly visible. You don’t have to interpret curves or infer trends; the comparison is built right into the visual structure. This clarity becomes especially valuable in situations such as the following:

  • Comparing measurements across locations, samples, or materials, like contaminant levels across wells or mineral content across drillholes.
  • Summarizing results from experiments or simulations, where each scenario or parameter set forms its own category.
  • Visualizing frequency counts or categorical distributions, such as soil classifications or event occurrences.
  • Supporting quick comparisons in reports and presentations, where stakeholders need insight quickly.

Another key advantage is that bar charts avoid implying continuity where none exists. Unlike line charts, they don’t suggest that values flow from one category into the next. This prevents misinterpretation and ensures your stakeholders see the data accurately.

When a Bar Chart Is Better Than Other Visuals

On top of understanding the advantages of bar charts, it’s equally important to know when bar charts outperform other types of data visualizations entirely. Below are a few types of plots that are just as simple and powerful at communicating scientific data, but aren’t the best choice in certain circumstances.

Bar charts vs. line charts

Line charts imply continuity and progression. They connect points as if one value flows naturally into the next. But in some geoscience and engineering workflows, your categories aren’t continuous: think sampling locations, material types, field sites, treatment conditions, or borehole IDs. Using a line chart here artificially suggests a trend where none exists. A bar chart avoids that trap by keeping each category independent and clearly labeled.

Bar charts vs. scatter plots

Scatter plots are powerful for relationships, but when your goal is comparison—not correlation—they can introduce unnecessary noise. If you simply need to show that Site B has twice the concentration of Site A, or that Sample 4 produced the lowest value, a bar chart does that in a single glance. It doesn’t require interpreting axes, picking apart point clusters, or guessing.

Bar charts vs. tables

Tables are great for storing values, but they’re ineffective at communicating patterns. Your stakeholders would have to scan numbers, calculate differences mentally, and search for the highest or lowest value. Bar charts eliminate that cognitive burden by turning comparisons into immediate, visual insight. Instead of reading, calculating, and interpreting, your stakeholders can simply see the insights.

Key Design Principles for Accurate and Effective Bar Charts

While bar charts have specific purposes and benefits over other visuals, their design still matters. Strong design choices ensure your bar charts make comparisons clear, intuitive, and trustworthy. With that in mind, here are some essential design principles that will help your bar charts communicate insight with accuracy and clarity.

1. Start axes at zero to avoid misleading comparisons

Because bar length visually represents value, starting the axis anywhere other than zero can exaggerate small differences and distort your data. A zero baseline keeps comparisons honest and easy to interpret.
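Here is a small illustration of the effect, again in Python’s matplotlib with made-up site values: the same three bars look dramatically different when the axis starts at 95 instead of 0.

    import matplotlib.pyplot as plt

    # Hypothetical measurements that differ by only a few percent
    sites = ["Site A", "Site B", "Site C"]
    values = [98, 100, 103]

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))

    # Truncated axis: a ~5% spread looks like a huge difference
    ax1.bar(sites, values)
    ax1.set_ylim(95, 105)
    ax1.set_title("Axis starts at 95 (misleading)")

    # Zero baseline: bar lengths stay proportional to the actual values
    ax2.bar(sites, values)
    ax2.set_ylim(0, 110)
    ax2.set_title("Axis starts at 0 (honest)")

    fig.tight_layout()
    plt.show()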

2. Limit the number of categories so insights stay visible

Bar charts shine when a stakeholder can compare values quickly. Overloading your chart with too many categories makes bars thin, labels cramped, and patterns difficult to spot. A focused set of categories keeps your chart readable and the insight clear.
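If you do have a long tail of minor categories, one common fix is to keep the largest few and collapse the rest into an “Other” bar. A minimal pandas sketch follows; the soil-class counts are hypothetical.

    import matplotlib.pyplot as plt
    import pandas as pd

    # Hypothetical frequency counts for soil classifications (made-up numbers)
    counts = pd.Series({"Clay": 42, "Silt": 35, "Sand": 28, "Gravel": 12,
                        "Peat": 4, "Loam": 3, "Marl": 2, "Till": 1})

    # Keep the largest categories and collapse the rest into "Other"
    top = counts.nlargest(4)
    grouped = pd.concat([top, pd.Series({"Other": counts.drop(top.index).sum()})])

    ax = grouped.plot.bar(title="Soil classes: top categories plus 'Other'")
    ax.set_ylabel("Sample count")
    plt.tight_layout()
    plt.show()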

3. Use color purposefully, not decoratively

Color should clarify relationships, not complicate them. Consistent color choices help reinforce grouping and comparison, while avoiding unnecessary or overly vibrant palettes keeps attention on the data rather than the design.

4. Ensure labels are clear, readable, and helpful

If someone has to squint, rotate the document, or guess at a label’s meaning, your chart loses its impact. Choose straightforward category names, keep axis values legible, and position labels horizontally whenever possible; if you find yourself tilting or rotating text to make it fit, it’s usually a sign you should switch to a horizontal bar chart.

5. Remove clutter that distracts from the main message

Excessive gridlines, 3D effects, heavy borders, and ornamental design elements often obscure the point of the chart. Simplicity is your ally. A clean layout puts the focus exactly where it belongs: on the differences between the bars.

Bringing Clarity, Comparison, and Confidence Together

Bar charts remain one of the most effective tools for geoscientists and engineers because they make comparisons clear, emphasize meaningful differences between categories, and help stakeholders interpret results quickly and confidently. That’s why they deserve a spot in your workflow, especially when you’re working on a project where they’ll shine. Just make sure you design them effectively so stakeholders experience the full power and benefits of bar charts.

Want to create high-quality, professional bar charts that inform your stakeholders? Explore the ready-to-use bar chart templates in the Golden Gallery, so you don’t have to start from scratch and can build on a strong design foundation for communicating insights.

FAQ: Bar Charts

What is a bar chart used for in geoscience and engineering work?

Bar charts are ideal for comparing values across discrete categories, such as sampling locations, material types, experimental conditions, or time periods (like months or years) that aren’t continuous. They make differences easy to see at a glance and help audiences quickly interpret which categories are higher, lower, or significantly different.

When should I use a bar chart instead of a line chart?

Use a bar chart when your data represents distinct categories rather than a continuous sequence. Line charts imply continuity or trends over time. If your values don’t naturally connect from one category to the next, a bar chart provides a clearer, more accurate representation without suggesting a trend that isn’t there.

When should I use a bar chart instead of a scatter plot?

Choose a bar chart when your goal is to compare groups, show frequency counts, or summarize categorized results. Scatter plots are designed to highlight relationships or correlations between two numerical variables. If your data isn’t about correlations—but instead about comparing categories—bar charts communicate those differences more clearly and without unnecessary noise.

How do I choose the right type of bar chart?

Choose a vertical bar chart when comparing categories from left to right—for example, sample sites or material classes. Choose a horizontal bar chart when category names are long or when you have many categories that would be hard to fit on a vertical axis.

What is the best software for creating professional bar charts?

For geoscientists and engineers, Grapher is one of the most powerful and flexible tools for creating professional bar charts. You can build bar charts from scratch or start with a template, then customize colors, spacing, labels, axes, and patterns to match your project’s needs. Grapher gives you control over the design to ensure your visuals look clean, clear, and publication-ready.

Where can I find free bar chart templates online?

You can find free, ready-to-use bar chart templates in the Golden Gallery, which is Golden Software’s curated library of templates made by and for scientists and engineers. If you’re using the latest version of Grapher, you can use bar chart templates from the Golden Gallery, customize them with your data, and quickly produce polished visuals.

What types of data work poorly with bar charts?

Bar charts aren’t ideal for continuous datasets, correlated variables, or situations where continuous real-time data matter more than categorical comparison. In those cases, line charts, scatter plots, or histograms often reveal patterns more effectively.

How many categories are “too many” for a bar chart?

As a general rule, if your audience can’t easily distinguish the bars at a glance, you have too many. Eight to twelve categories are ideal; beyond that, comparisons become harder, especially in presentations. Consider grouping categories or using another chart type if clarity starts to drop.

How do I make my bar chart easier for non-technical audiences to understand?

Use clear labels, consistent color schemes, and straightforward axis scaling. Avoid jargon in category names and emphasize the key takeaway in your title or caption. Simplicity, alignment, and clean spacing go a long way toward improving clarity.


Data Presentation That Makes an Impact: How to Communicate Insights Clearly

In geoscience and engineering, doing great data collection and analysis is only half the job. The other half—and sometimes the harder one—is communicating the insights that actually came from those efforts. No matter how rigorous your data gathering or how precise your calculations, your insights only make an impact when stakeholders understand them.

Unfortunately, that’s where many projects stumble. Insights can lose their value if visuals aren’t clear. When maps, models, or graphs are cluttered, confusing, or overly technical, stakeholders can easily misinterpret takeaways, leading to a decrease in confidence, rejected recommendations, and poor decision-making.

That’s why great data presentation is a core professional skill. Being able to share insights clearly, whether in 2D or 3D, equips you to bridge the gap between complex data collection and analysis and actionable understanding.

Common Data Presentation Mistakes (and How to Avoid Them)

Now, what steps can help enhance data presentation skills? First, it’s important to know what mistakes to avoid. Even skilled geoscientists and engineers can unintentionally make choices that weaken their visuals, causing key insights to get lost or misunderstood. To keep that from happening, here are a few of the most common data presentation mistakes geoscientists and engineers make.

Mistake 1: Placing Importance on Every Data Point

When every dataset, layer, or variable is shown at once, the main insights get buried. Instead of clarifying the takeaways, the visual becomes overwhelming, leaving stakeholders confused about what matters and what they should focus on. The result? A misinterpretation of the findings, leading to delayed decisions.

Mistake 2: Using Default Settings Without Question

Default color scales, contour intervals, or axis scales are convenient, but they’re not always tailored to your specific data. They can unintentionally mask important variations or exaggerate patterns that don’t actually exist. Relying on them every time—without considering customizations you can make—leads to maps, models, and graphs that don’t faithfully represent your data, creating confusion rather than clarity.

Mistake 3: Assuming the Audience Understands the Data

Because you’re familiar with every assumption, limitation, and nuance of your data, it’s easy to forget your audience isn’t, especially if they’re non-technical stakeholders. When labels, legends, or visual cues aren’t clear, your audience will have to guess at meanings, and guesswork almost always leads to misunderstanding or hesitation.

Best Practices for Clear, Confident Data Presentation

Once you know what mistakes to avoid, the next step is to simply implement best practices that’ll help you communicate insights. However, keep this in mind: strong data presentation isn’t just about making your map, model, or graph “pretty”; it’s about making intentional design choices that give stakeholders a clear understanding of your main takeaways so they can take informed action. That said, here are some core best practices for approaching data presentation effectively. 

Choose the Right Visualization for the Story You Need to Tell

The foundation of effective data presentation is choosing the right visual format. No single map or graph works for every scenario. Different visuals highlight different relationships, and selecting the wrong one can unintentionally obscure your insights. Below is a quick look at common and specialized map and graph types and when each one is the right choice.

When to Use Different Map Types

  • Contour Maps: Best for showing smooth, continuous variation across a surface—like groundwater levels, temperature, or topography.
  • Color Relief Maps: Ideal when you want audiences to quickly grasp spatial gradients or intensity changes using intuitive color scales.
  • Shaded Relief Maps: Use these when terrain interpretation matters; shading creates a realistic sense of elevation and slope.
  • 3D Surface Maps: Choose this when elevation, volume, or surface geometry is essential to understanding your data, especially when depth relationships influence interpretation.
  • Base Maps: Useful when your goal is to anchor your data to real-world locations using imagery, streets, or land cover for context.
  • Classed Post Maps: Best for showing how sample locations fall into key categories or thresholds (e.g., contamination levels, soil types).
  • Vector Maps (Arrows, Quivers): Choose these for directional data, such as groundwater flow, wind direction, or current movement.
  • Volumetric or Isosurface Maps: Best for visualizing 3D plumes, ore bodies, or subsurface concentration zones that cannot be interpreted easily in 2D.

When to Use Different Graph Types

  • Line Graphs: The go-to for showing change over time, trends, or continuity between data points.
  • Scatter Plots: Use these to highlight relationships or correlations between two variables, which is great for exploratory analysis.
  • Bar Charts: Best for comparing discrete categories or summarizing grouped data.
  • Histograms: Ideal for understanding data distribution, spread, or skew—commonly used in QA/QC or statistical analysis.
  • Box Plots: Use these when comparing distributions across multiple groups; they quickly show medians, variability, and outliers.
  • Ternary Diagrams: Choose these when analyzing three-component systems, such as soil composition or geochemical ratios.
  • Stiff, Piper, or Durov Diagrams: These are essential for water chemistry interpretation, equipping you to compare ionic composition and classify water types.
  • Polar or Wind Rose Plots: Great for directional datasets where frequency and magnitude depend on angle (e.g., wind speed, flow direction).
  • Bubble Charts: Helpful when you need to show relationships among three variables (x, y, and bubble size) in a compact visual.

Ultimately, you have a lot of visual formats at your disposal, but choosing the right one is the first step toward ensuring you communicate your data-driven insights well.

Reduce Clutter So the Insight Stands Out

Overloaded visuals bury your message. When a map, model, or graph tries to show everything at once—too many layers, labels, symbols, or colors—the most important takeaways aren’t easily identifiable. To overcome that issue, prioritize the insights stakeholders need to see first, then strip away anything that doesn’t directly relate to those points. 

Example: A cluttered chart comparing Colorado and U.S. income with Colorado and U.S. home prices.

Example: The same comparison, de-cluttered so the key relationship stands out.

Use Color, Scale, and Symbology Intentionally

The wrong color scale can misrepresent gradients, hide subtle variation, or introduce bias. Likewise, poorly chosen contour intervals, axis limits, or symbol sizes can distort meaning.

Choose color scales that reflect the nature of your data, set scales that accurately represent values, and maintain consistent symbology across figures. When every stylistic choice supports clarity, stakeholders can immediately focus on the meaning behind your insights rather than the mechanics.
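As a simple illustration of how much the color scale alone can change a reader’s impression, here is a sketch using Python’s matplotlib with a synthetic surface; the colormap names (“jet”, “viridis”) are matplotlib’s, not any particular mapping product’s.

    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic gridded surface standing in for, say, groundwater elevation
    x, y = np.meshgrid(np.linspace(0, 10, 200), np.linspace(0, 10, 200))
    z = np.sin(x) * np.cos(y / 2) + 0.1 * x

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

    # Rainbow-style colormaps can create false visual boundaries in smooth data
    ax1.contourf(x, y, z, levels=20, cmap="jet")
    ax1.set_title("'jet': uneven perceptual steps")

    # Perceptually uniform colormaps map equal data steps to equal visual steps
    ax2.contourf(x, y, z, levels=20, cmap="viridis")
    ax2.set_title("'viridis': perceptually uniform")

    fig.tight_layout()
    plt.show()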

Guide Stakeholders With Clear Annotations and Context

Even strong visuals can fall flat if stakeholders don’t know what they’re viewing. That’s why annotations matter. Labels, callouts, and legends guide your audience through your visual so they understand the main takeaways. Keep in mind, though, that just enough context prevents confusion; too much creates noise. Use annotations where necessary to ensure stakeholders have the insights they need and don’t get bogged down by the details.

Structure a Data Narrative That Leads to Decisions

Data presentation is storytelling. A strong narrative follows a logical arc: Question → Method → Insights. Your visuals should follow this same progression. Sequence them so each design element builds on the last, helping your audience connect steps and arrive at your insights naturally. When the story flows, decision-making becomes faster and more confident.

Know When 3D Adds Clarity and When It Doesn’t

3D visualization is an incredibly powerful tool, but like any tool, it’s most effective when used intentionally. Not every dataset or deliverable needs a 3D model, and using 3D at the wrong time can confuse your audience. The key is understanding when 3D genuinely enhances clarity and when a well-crafted 2D visual communicates the message more effectively.

Here are general guidelines for when each approach works best:

  • When 3D helps: Subsurface mapping, overlapping features, volume interpretation, and spatial relationships that can’t be seen clearly in 2D.
  • When 2D is better: Simple comparisons, regulatory deliverables, quick checks, or time-sensitive workflows.

Ultimately, using 3D should be a thoughtful decision to ensure you get the most benefits from it. Choosing the right dimensionality will support your ability to effectively communicate insights from your data.

Tailor Visuals to Your Audience

Different audiences bring different levels of technical expertise, priorities, and expectations to the table. A visual that works perfectly for an engineer may overwhelm a community stakeholder, and a simplified public-facing graphic may be too imprecise for a regulatory agency.

Understanding what each group needs to see is essential to ensuring your message lands. For example:

  • Clients and regulators typically need accuracy, analysis, and defensibility.
  • Internal teams may need more technical depth to collaborate effectively.
  • Public audiences often require simplified visuals without losing factual accuracy.

The strongest data presentations meet viewers where they are. They give each audience exactly what they need to understand the insights so they can make informed decisions.

What Leads to Great Real-World Decision-Making

Effective data presentation is about sharing insights in a way stakeholders can understand and act on. When your visuals achieve that goal, they bridge the gap between technical data collection and analysis and actionable understanding. That’s what transforms your work from “informative” to truly influential. So whether you choose a 2D map, a scientific diagram, or a 3D model, the goal is the same: help stakeholders understand the insights from your data. Mastering this takes practice, but the payoff is significant. 

Want to continue sharpening your data visualization skills and growing as a geoscientist and engineer? Subscribe to the Golden Software blog!


How Better Project Documentation Makes You a More Effective Geoscientist

In geoscience, most professionals are laser-focused on data collection, analysis, and delivering high-quality final outputs, which means project documentation often falls to the bottom of the priority list. But weak documentation creates friction in every part of a workflow. When assumptions, decisions, and methods aren’t clearly recorded, even the most straightforward visuals become harder to reuse, update, collaborate on, or defend later.

That’s why strong project documentation is more than a formality. It’s a professional skill that helps geoscientists work faster, avoid unnecessary rework, collaborate more effectively, and have more productive conversations with stakeholders. But how can you strengthen your documentation in a way that genuinely improves your workflow? Let’s dive in to unpack the answer.

What Project Documentation Means in Geoscience

First, let’s get clear on what project documentation actually means in geoscience. Simply put, it’s the collection of notes, decisions, and records that explain how and why a project was created in the first place.

Strong documentation captures the context behind your workflow, providing the narrative that connects the dots between raw data and final output so your work can be reused, updated, collaborated on, or defended later. It answers key questions, including: 

  • Why did you choose a specific map, model, or graph?
  • How did you handle outliers?
  • What assumptions guided your design?
  • Which datasets were included or excluded?
  • What was your thought process behind key decisions?
  • What customizations did you make to your visualization? 

Just as important, project documentation doesn’t live in just one place. It can appear inside your project files, in clearly named folders, in notes attached to models or maps, or in supplemental materials that track methods and version history. Regardless of the format, the function is the same: preserving the context of your workflow so your work is easy to reuse, update, collaborate on, or defend.

When you achieve that goal, your project becomes durable. If the project evolves, new data arrives, a team member wants to collaborate, or stakeholders ask questions, you aren’t forced to reconstruct your workflow from memory. Instead, you have a clear record that keeps the project moving forward smoothly.

What Happens When Documentation Is Weak? The Real Costs

Now, what happens when you don’t have strong project documentation? We’ve hinted at it already, but let’s discuss the consequences more plainly. Here’s the friction you’ll experience in your workflow when your project documentation isn’t strong.

Design Decisions Have No Explanation

One of the biggest problems with weak documentation is lost context. When the reasoning behind your choices isn’t recorded, future you—or anyone else touching the project—has no way of understanding why certain decisions were made. This can show up in a few ways:

  • No notes explaining how a visualization was designed or how it should be updated.
  • Missing justification for customization choices around color scales, symbology, filters, or gridding settings.
  • No record of parameters you tested and discarded, which makes it easy to repeat mistakes.

Without this context, even a polished graph, map, or model becomes difficult to reuse, update, collaborate on, or defend. The “what” is visible, but the “why” is lost, and that part matters when making data visualizations.

More Time Is Wasted

When your documentation is weak, the next phase of the project—like updating your visual with new data—often requires retracing your own steps instead of building on prior work. That slows momentum and forces you to investigate rather than focus on analysis. When that happens, three consequences appear: 

  • You spend too much time trying to remember how you produced a map, model, or graph.
  • Recreating workflows becomes guesswork instead of straightforward execution.
  • Long pauses between project stages lead to unnecessary restarts because the original process wasn’t documented.

Over the course of a project—or an entire career—these inefficiencies add up to hours of avoidable rework.

Collaboration Becomes Harder

Geoscience is collaborative by nature, but poor documentation makes it harder for teams to work together effectively. When teammates can’t follow your process, collaboration slows and feedback becomes less actionable. You’ll start to notice that: 

  • Colleagues struggle to understand how results were generated or what assumptions were used.
  • Reviews take longer because they require extra explanations.
  • Decision-making is delayed by repeated clarification cycles.

Weak documentation creates barriers where there should be alignment, and that affects project timelines as well as project quality.

Defensibility and Traceability Are Compromised

In technical and regulated industries, your work must be able to stand on its own. Without clear documentation, you lose the “paper trail” that connects your raw data to your final conclusions. This creates a two-fold risk:

  • Increased Misinterpretation: When workflows, assumptions, and datasets aren’t explicitly recorded, stakeholders are forced to guess. This leads to misread results, incorrect decisions, and a breakdown in trust between you and your audience.
  • Vanishing Accountability: If a regulator or client challenges a specific model or map, you must be able to show exactly how you got there. Weak documentation leaves you unable to justify your parameters or symbology, making your conclusions difficult to defend under scrutiny.

Ultimately, documentation is your professional insurance policy. Without it, even the most accurate analysis can be dismissed if it can’t be traced and verified.

Practical Best Practices for Creating Strong Project Documentation

So how can you build project documentation that makes it easy for you to reuse, update, collaborate on, or defend? Below are essential best practices that can help you document projects efficiently while improving long-term workflow quality.

1. Use Clear, Consistent File and Project Naming

Strong documentation starts with your digital file names. Clear naming conventions help you and others instantly identify the latest version, the correct dataset, or the appropriate visual without digging through folders or opening multiple files. This prevents confusion and speeds up workflow transitions.

For a concise, standardized naming structure, consider including:

  • Project or site name
  • Date or sampling round
  • Version number or milestone

This simple practice will eliminate ambiguity and ensure the right file is always easy to find.
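If you generate or rename files programmatically, you can even encode the convention in a tiny helper. The Python sketch below is purely illustrative; the site name, label, and .grf extension are placeholder assumptions, not a required format.

    from datetime import date

    def project_filename(site: str, label: str, version: int, ext: str = "grf") -> str:
        """Build a name like '<site>_<label>_<YYYY-MM-DD>_v<version>.<ext>' (pattern is illustrative)."""
        return f"{site}_{label}_{date.today().isoformat()}_v{version}.{ext}"

    # Hypothetical example output: 'PineCreek_GroundwaterContours_2026-02-11_v3.grf'
    print(project_filename("PineCreek", "GroundwaterContours", 3))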

2. Capture Decisions in Context Using In-Document Comments

One of the biggest challenges in long-term or seasonal projects is remembering exactly how you built a visual or the settings you used. The same problem arises when handing a project to a colleague who must decipher your filters, styling choices, or workflow from scratch.

You can solve this by leaving comments explaining the reasoning behind your choices inside the project file itself. These embedded notes will preserve the context behind your work, so you never have to retrace your steps, rely on memory months later, or create confusion among team members.

Consider creating comments to record:

  • Why a specific filter or setting was chosen
  • What a color scheme, axis range, or symbol style is meant to highlight
  • Instructions for collaborators on what to review, adjust, or replicate
  • Assumptions, limitations, or considerations for future updates

When comments live in the same document as the visualization, nothing gets lost—and both you and your team work faster.

This image showcases Grapher's new in-document comments feature that you can use to leave notes on visuals, which is a great project documentation tactic.

3. Track Versions Intentionally

Good version tracking prevents accidental overwrites, supports defensibility, and captures the evolution of your project. So instead of constantly overwriting files, save milestone versions as the project progresses. Each version becomes its own snapshot, which can be useful for audits, regulatory review, team collaboration, or retracing earlier conclusions.

When tracking versions, helpful things to add include:

  • Brief notes on what changed and why
  • Distinct version numbers (v1, v2, v3) aligned with project checkpoints

Ultimately, intentional versioning reduces confusion and preserves a clean historical record.
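For teams that script parts of their workflow, milestone versioning can be as simple as copying the working file to a numbered snapshot and appending a one-line note to a changelog. The Python sketch below is an assumption-laden illustration; the file names and changelog format are placeholders, not a prescribed structure.

    import shutil
    from pathlib import Path

    def save_milestone(project_file: str, version: int, note: str) -> None:
        """Copy the working file to a versioned snapshot and log what changed and why."""
        src = Path(project_file)
        snapshot = src.with_name(f"{src.stem}_v{version}{src.suffix}")
        shutil.copy2(src, snapshot)
        with open(src.with_name("CHANGELOG.txt"), "a") as log:
            log.write(f"v{version}: {note}\n")

    # Hypothetical usage (file name and note are placeholders):
    # save_milestone("PineCreek_GroundwaterContours.grf", 3, "Added Q1 sampling round")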

4. Maintain a Simple Project Overview

A project overview acts as a roadmap for anyone who opens a file, including you if you open an old document months later. The short summary will provide essential context without requiring a deep dive into the full workflow.

A strong overview typically includes:

  • Project goals or questions being answered
  • Primary datasets used
  • Current status or progress
  • Any unresolved questions or known limitations

This quick reference will dramatically reduce onboarding time when either reopening a project after significant time has passed or handing it off to a colleague midstream.

5. Document Data Sources and Metadata

Data rarely speaks for itself. Recording details about where it came from, how it was processed, and its limitations makes your work transparent and repeatable. This is essential during QA/QC, regulatory reporting, peer review, or collaboration with technical teams.

Key metadata you should capture includes:

  • Data source, collection date, and sampling method
  • Coordinate systems and units
  • Processing steps, filters, or transformations applied
  • Known limitations or uncertainty

When metadata is clear, the credibility and defensibility of your work increase significantly.
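One lightweight way to keep this metadata with the project is a small machine-readable record saved beside the data. Here’s a hypothetical Python example; every field and value is a placeholder you’d replace with your own details.

    import json

    # Every field and value below is a hypothetical placeholder
    metadata = {
        "source": "Field sampling, Pine Creek site",
        "collected": "2026-01-15",
        "method": "Low-flow groundwater sampling",
        "coordinate_system": "NAD83 / UTM zone 13N",
        "units": {"concentration": "mg/L", "elevation": "m"},
        "processing": ["Removed duplicate records", "Kriging, 10 m grid spacing"],
        "limitations": "Sparse coverage in the northwest corner of the site",
    }

    # Saving it next to the data keeps the context with the project
    with open("PineCreek_metadata.json", "w") as f:
        json.dump(metadata, f, indent=2)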

6. Keep Assumptions and Limitations Explicit

All geoscientific interpretation involves assumptions. Failing to document them can lead to misinterpretation, miscommunication, or incorrect decisions down the road. So, writing down the why behind your decisions acknowledges uncertainty while strengthening trust in your outputs. 

Here are examples of assumptions to share:

  • Reasoning behind boundary interpretations
  • Areas of uncertainty or sparse data
  • Conditions you assumed to be constant

Clear assumptions enable clearer conversations with stakeholders and smoother technical review.

7. Align Documentation With Project Milestones

Documentation is easiest and most accurate when added gradually—not saved for the end of a project. Natural checkpoints such as sampling rounds, analysis, internal reviews, or draft-report stages are ideal moments to update notes. Providing context at each major milestone keeps documentation fresh and accurate.

8. Centralize Project Information

Documentation loses value when it’s scattered across personal folders, email attachments, and chat messages. Centralizing your data, notes, and outputs ensures everyone has access to the same information. Consider using a shared and consistent structure (team drive, SharePoint, secure cloud folder, etc.) to keep project knowledge accessible and prevent critical context from disappearing.

9. Standardize Documentation Where Possible

Consistency is one of the most powerful documentation tools. Standardizing your approach to naming, note-taking, summaries, data visualization, and version tracking removes friction and makes it easier to switch between projects and get up to speed. One easy way you can start standardizing documentation is to create checklists or standard operating procedures (SOPs) for everyone to follow. 

10. Review Documentation Before Final Deliverables

Before submitting a report, exporting visuals, or delivering results, take a moment to ensure your documentation reflects the current project state. This final review reduces last-minute confusion and protects your work from avoidable questions during stakeholder review. 

During the review, confirm the following:

  • Decisions and assumptions are clearly recorded
  • Metadata is complete
  • Version notes reflect the latest file
  • Instructions for future updates are accurate

This final check strengthens both the clarity and defensibility of your results.

The Benefits of Strong Project Documentation

Now, what can you expect to gain by putting these project documentation habits into practice? The impact is bigger—and more immediate—than you may initially think. Here are some advantages you’ll start noticing right away.

Faster Project Restarts With Less Frustration

When documentation preserves the “why” behind your decisions, picking a project back up becomes far easier. Instead of spending hours retracing your steps or reverse-engineering old settings, you can quickly reorient yourself and resume progress with confidence. This is especially valuable for seasonal work, long-running studies, or projects with lengthy pauses between phases. Clear documentation shortens the ramp-up time and gets you back to productive work sooner.

Clearer Collaboration and More Productive Feedback

Good documentation eliminates the guesswork that often slows collaboration. When teammates can clearly see your assumptions, decisions, and workflows, they don’t have to interpret your process; they can just build on it. This leads to faster reviews, sharper feedback, and fewer back-and-forth clarification cycles. Whether you’re working across departments or handing a project to a colleague, strong documentation keeps everyone aligned.

Greater Confidence When Explaining Results

When you document the reasoning behind your maps, models, and graphs, explaining results becomes easier and far more defensible. You can clearly articulate how an output was generated, why certain parameters were chosen, and what assumptions guided the analysis. That transparency strengthens trust with clients, regulators, project managers, and internal reviewers alike. It also reduces repetitive questions and skepticism during presentations or reporting.


Higher-Quality Final Deliverables

When comments, assumptions, limitations, and workflow notes are captured along the way, errors are less likely to slip into final outputs. Good documentation serves as a built-in QA/QC tool, helping you spot inconsistencies earlier and produce cleaner, more accurate maps, models, and graphs.

Improved Knowledge Retention Across Your Organization

Strong documentation not only helps the current project team but also strengthens institutional knowledge. Processes, lessons learned, and repeatable workflows become easier to preserve, share, and scale across other projects or departments. Over time, this builds a more efficient, informed, and capable team.

Document Smarter, Work Faster

Project documentation shouldn’t be a low priority. It’s one of the most practical ways to become a more effective, efficient, and trusted geoscientist. When your decisions, assumptions, and methods are clearly captured, every part of your workflow becomes easier. You can reuse, update, collaborate on, and defend your outputs with confidence and clarity. 

Even better? You don’t have to jump through hoops to start documenting projects well. You can begin with one of the simplest, highest-impact tools available: Grapher’s Document Comments. By recording notes directly inside your project file, you preserve the context that future-you—and your colleagues—need to work quickly and accurately. Want to try it for yourself! Download a free 14-day trial of Grapher and start incorporating strong project documentation into your workflow!


This 3D data visualization showcases contour slices and a contamination plume.

How Geoscientists Can Clearly Communicate Insights from Their 3D Data Visualization

3D data visualization has transformed the way geoscientists and engineers interpret and explain complex information. This format showcases depth and spatial relationships that would otherwise be difficult—if not impossible—to understand in 2D. But while 3D models can reveal underground features, surface and subsurface connections, and complex spatial relationships, there are still two elements that can take 3D data visualizations to the next level: motion and interactivity.

Humans don’t just understand space visually. They understand it through motion and interaction. People walk around objects to understand their shape, tilt their heads to gauge depth, and shift their perspective to see how features relate. That’s why adding motion and interactivity to your 3D data visualizations is key to making them even more intuitive and impactful for stakeholders. 

Why Complex Geoscience Data Often Needs Movement to Be Understood

What does it mean to add motion and interactivity to your 3D data visualization? It means giving stakeholders the ability to move, rotate, zoom, and shift perspective within the model, so they can infer depth, scale, and spatial relationships even more readily than before.

Geoscience and engineering data are inherently dynamic. Subsurface layers dip and fold. Plumes migrate. Geological contacts intersect in ways that only make sense when seen from multiple angles. A static view can’t capture all of that effectively, which is why motion and interactivity are indispensable.

Motion equips stakeholders to explore depth in a natural way. Rotating a 3D model, for example, exposes features that would otherwise stay hidden behind surfaces. Zooming in and out helps grasp true scale, especially in datasets where dimensions span hundreds of meters vertically and horizontally. And navigating through 3D visualizations gives stakeholders the same perspective they would experience in the real world.

Interactivity also reduces the need for lengthy explanations. For instance, instead of describing the concentration levels of a plume, you can let the visualization do the communicating. When stakeholders can manipulate the 3D model themselves—by turning layers on and off—they can understand the situation faster and more intuitively, and that leads to clearer discussions and better decisions.

Ways to Incorporate Motion and Interactivity Into 3D Data Visualizations

Bringing motion and interactivity into your 3D data visualizations doesn’t require overhauling your workflow; it simply requires incorporating tools that let your visualizations behave more like real environments. Below are some of the most effective ways to add movement and interactivity to deepen understanding among your stakeholders.

Walking on a 3D Model: Helping Viewers Explore the Data Themselves

Walking on a 3D model gives stakeholders the opportunity to navigate your insights at their own pace, moving through results as if they were walking across the project area. This freedom to explore equips stakeholders to focus on the areas most relevant to them—whether that’s a potential drilling location, a suspected contamination zone, or a region of structural complexity. Instead of relying solely on your chosen viewpoints, stakeholders can inspect the project area from angles that help them build their own understanding. As a result, they’re more likely to grasp subtle relationships, like how a proposed site sits relative to slopes or drainage. 

Fly-Throughs: Showing the Big Picture First

Fly-throughs are an excellent way to set the stage. By providing a pre-recorded tour of your 3D model, fly-throughs can establish spatial context early. This can help stakeholders understand where key features sit relative to each other—like watershed boundaries, elevation changes, infrastructure locations, or regional geological patterns—before the conversation shifts into more granular analysis.

Also, because fly-throughs are pre-recorded, they offer a structured narrative that gently leads stakeholders through the insights you want to share. They prevent information overload by introducing complexity in a gradual, intentional way. Instead of overwhelming stakeholders with a model they must explore on their own, fly-throughs equip you to curate the experience, ensuring everyone receives the same clear orientation before deeper discussion begins.

Video: Making Complex Insights Easy to Share

Videos are one of the most accessible formats for communicating 3D insights, especially when sharing results with audiences who may not have the software or technical background needed to explore a model interactively. A well-made video equips stakeholders to absorb the information without navigating any tools themselves, removing barriers and ensuring consistent understanding across the group.

Repeatability is another major advantage. Unlike a live demonstration, a video can be replayed, paused, and reviewed as many times as needed—whether during internal discussions, stakeholder meetings, or regulatory reviews. This makes videos incredibly powerful for communicating complex insights.

Animations: Revealing Patterns Over Space and Time

Animations help show progression, movement, or relationships as they unfold. Whether you’re visualizing plume migration, changes in groundwater elevation, or the evolution of terrain, motion reveals trends that would otherwise remain hidden. The key advantage of animations is that they control the pace of information delivery. Smooth transitions highlight what’s changing—and what isn’t—helping your stakeholders detect relationships that could be missed when comparing individual snapshots. 
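Even a simple scripted animation can demonstrate the idea. The sketch below uses Python’s matplotlib to animate a hypothetical plume migrating along a transect; the data are synthetic, and a dedicated 3D package would handle the real thing, but the principle of revealing change frame by frame is the same.

    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.animation import FuncAnimation

    # Synthetic stand-in for a plume migrating along a transect
    x = np.linspace(0, 100, 200)

    fig, ax = plt.subplots()
    line, = ax.plot(x, np.zeros_like(x))
    ax.set_ylim(0, 1.1)
    ax.set_xlabel("Distance along transect (m)")
    ax.set_ylabel("Relative concentration")

    def update(frame):
        center = 20 + frame        # the plume moves downgradient each time step
        width = 10 + 0.3 * frame   # and spreads as it migrates
        line.set_ydata(np.exp(-((x - center) ** 2) / (2 * width ** 2)))
        ax.set_title(f"Time step {frame}")
        return line,

    anim = FuncAnimation(fig, update, frames=60, interval=100)
    plt.show()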

Exporting to 3D PDF: Giving Stakeholders Lightweight, Click-and-Explore Interactivity

Exporting your model to a 3D PDF will lead to an intuitive file that anyone with Adobe Reader can open, explore, and use to understand key insights. With a 3D PDF, stakeholders can rotate, zoom, and pan the model. Most importantly, they can toggle layers on and off, a powerful way to help them focus on the insights that matter most. For example, a regulator reviewing a remediation plan may want to isolate the plume layer, then add stratigraphy, then add monitoring wells to understand spatial relationships step by step.

Best Practices for Using Motion and Interactivity Effectively

Even if you add motion and interactivity to your 3D data visualization, you need to do it well for it to be effective. Thoughtful design ensures stakeholders quickly grasp insights instead of getting lost in movement or overwhelmed by details. That said, here are three key best practices to ensure your 3D models are clear as you add motion and interactivity.

Keep motion intentional and purposeful

Every rotation, zoom, or camera path should serve a purpose. Ask yourself: What insight does this movement reveal that a static view cannot? If the motion doesn’t help clarify a relationship, show a transition, or draw attention to something important, it’s most likely unnecessary. Keep every movement purpose-driven.

Control pacing so viewers can follow the story

Even the best visuals lose impact when they move too quickly. Slow, deliberate pacing allows stakeholders to absorb details, understand spatial context, and connect what they’re seeing to what you’re explaining. Whether you’re creating a fly-through or animating a plume boundary over time, err on the side of clarity over speed.

Avoid unnecessary complexity that distracts from the message

Too much movement, too many layers, or overly dramatic transitions can work against you. If stakeholders are dazzled by the animation but unclear about the insight, the visualization has failed its purpose. Keep the interactivity simple wherever possible and reserve motion for the moments that truly elevate understanding.

Helping Your 3D Data Visualizations Speak for Themselves

By adding motion and interactivity—whether through walking on a 3D model, fly-throughs, animations, videos, or 3D PDFs—you give stakeholders the opportunity to experience your data rather than simply view it. That shift transforms complex geologic structures, subsurface features, and spatial relationships into something intuitive, memorable, and far easier to act on.

But ultimately, the goal shouldn’t be to create movement for movement’s sake. You need to ensure you’re following best practices so your model is clear. Take intentional and thoughtful steps to highlight the insights that matter most. When motion and interactivity is used purposefully, your 3D visualizations become powerful communication tools that build understanding, empowering better decisions among stakeholders.

Ready to take your 3D communication to the next level? Explore how motion tools in Surfer can help you present your work with more clarity, impact, and confidence. You can try Surfer free for 14 days and experience firsthand how dynamic 3D visualization transforms understanding.


The Right-Fit Approach: How to Know If a Software Solution Is the Perfect Match

In geoscience and engineering, there’s no shortage of software promising to make your work faster, smarter, and easier. From data visualization to modeling and analysis, the options are endless, and that’s both a benefit and a challenge. With so many tools available, there’s bound to be one that’s great for you, but how do you select the right one?

Simply put, there’s no one-size-fits-all software solution. Because every project is different, most professionals need a toolbox of solutions that work together to cover different needs, from gridding and 3D modeling to graphing and reporting. The challenge is figuring out which tools deserve a spot in your toolbox. That’s where the right-fit approach comes in.

Why the Right Software Choice Matters

The right-fit approach is all about choosing software that helps you do what you need to do. Instead of chasing industry trends or flashy features, it’s about identifying solutions that fill the gaps in your toolbox and align with your specific goals, workflow, and deliverables. When a tool checks these boxes, it reduces friction. Every click, import, analysis phase, and visualization step flows more smoothly, and that kind of ease directly translates into better results. You’ll experience greater efficiency and the ability to produce clearer visuals and reports that impress stakeholders.

The Right-Fit Criteria: What to Look for in Your Next Software Solution

So how can you practically determine whether a software solution belongs in your toolbox? There are ten key benefits you should seek if you want to have confidence that a tool is truly the right fit.

1. Easy Access to Helpful, Knowledgeable Customer Support

When you’re on a deadline and something doesn’t work as expected, responsive and knowledgeable support can make all the difference. Look for software backed by a team that understands geoscience and engineering workflows—not just generic tech support. Talking to people who grasp your project needs and deliver insights quickly builds confidence and ensures you get real solutions fast.

2. An Intuitive Interface and Thoughtful User Experience

You shouldn’t have to spend hours looking for certain features in your software to get results. The right tool is user-friendly, reducing trial and error so you can spend time visualizing data, not figuring out which button to press. An intuitive design will benefit you whether you’re a new user learning the ropes or a seasoned user working against tight timelines.

3. Comprehensive Tools for 2D and 3D Mapping

In geoscience and engineering, flexibility matters. Some projects are best visualized in 2D, while others demand the added depth of 3D. Choose software that supports both, so you can use the right dimensionality at the right time. Keep in mind that the best tools make 3D visualization easy to tackle, not overwhelming.

4. Versatile Gridding Capabilities You Can Trust

Accurate gridding is the foundation of meaningful interpretation. Software should give you full control over gridding methods and parameters while delivering reliable, repeatable results. When your grid is accurate and realistic, your confidence in every contour, cross section, and model will grow, so consider using a tool with superior gridding capabilities.
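The reason method and parameter control matter is that different gridding algorithms can produce noticeably different surfaces from the same scattered points. Here is a generic, tool-agnostic sketch using SciPy’s griddata with synthetic data (not any specific product’s gridding engine) that makes the comparison easy to run.

    import numpy as np
    from scipy.interpolate import griddata

    # Synthetic scattered measurements (x, y) with a smooth underlying trend
    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 100, size=(50, 2))
    values = np.sin(xy[:, 0] / 15) + 0.02 * xy[:, 1]

    # Regular grid to interpolate onto
    gx, gy = np.meshgrid(np.linspace(0, 100, 101), np.linspace(0, 100, 101))

    # Different gridding methods can yield noticeably different surfaces
    linear_surface = griddata(xy, values, (gx, gy), method="linear")
    cubic_surface = griddata(xy, values, (gx, gy), method="cubic")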

5. Ability to Integrate With Other Tools and Data Sources

No software exists in isolation. You need tools that can import, export, and share data seamlessly with other platforms in your toolbox. Whether you’re pulling in LiDAR data, working with CAD files, or exporting to another visualization solution, the right software fits into your existing ecosystem instead of forcing you to rebuild a map, model, graph, or chart.

6. An Effective Onboarding Process for a Smooth Learning Curve

Powerful software only helps if you can actually use it. That’s why good onboarding—complete with documentation, tutorials, and guided examples—is so important. It reduces frustration, shortens the learning curve, and gets you from installation to insight faster. When you evaluate tools, finding one that comes with a great onboarding experience is critical.

7. Customization and Control Over Outputs

Every dataset tells a unique story, and you should have the power to present it clearly. The right software equips you to easily finetune colors, scales, and layouts to meet both technical and stakeholder expectations. Whether you’re preparing a report, presentation, or publication, full control over your designs will ensure visuals match necessary standards.

8. Scalability as Projects and Data Grow

Your projects evolve, and your software should be ready to evolve with them. Look for tools that can handle increasingly complex datasets and workflows without compromising performance. Scalability ensures long-term value and keeps your toolbox future-ready.

9. Clear Documentation and Learning Resources

Even experts need a refresher now and then. Searchable articles, tutorials, webinars, and real-world examples make it easy to learn, troubleshoot, and grow your skills independently. Self-service resources also save time and reduce reliance on support teams. If a solution doesn’t come with the opportunity to enhance your data visualization skills on your own time, it may not be a long-term option. 

10. A Track Record of Reliability and Ongoing Development

Software isn’t static, and neither are your needs. Choose tools backed by teams committed to regular updates and innovation. Active development and consistent reliability show that a solution isn’t just built for today’s work, but for tomorrow’s challenges too. And that’s what you need if you want to continue helping stakeholders solve complex problems around the world. 

How to Apply the Right-Fit Approach When Evaluating Software

Knowing what to look for is one thing, but knowing how to test for it is where the right-fit approach really comes to life. The best way to determine if a software solution is the right fit for you isn’t just by comparing feature lists or scrolling through product pages, but by experiencing how it performs with your data and your goals in mind. That said, here are different ways to evaluate software to know if it’s worth adding to your toolbox.

Start With Hands-On Evaluation

Free trials and demos exist for a reason: to help you know if a solution is the best choice for you. Instead of just reading about what the software can do, test how it actually handles your project data and everyday tasks. This will give you a true sense of whether the tool complements your workflow or adds friction to it.

Ask Questions and Gauge the Response

A company’s support team can tell you a lot about what it’s like to use their product long-term, so ask questions during any free trial or demo. Does the team respond quickly? Are they knowledgeable about your field? Do they offer helpful guidance and suggestions? Getting direct insight from real people helps you gauge not only the software’s technical fit but also the reliability of the team behind it.

Explore the Learning Resources

Even intuitive software requires a little ramp-up. Take a look at the company’s blogs, tutorials, documentation, and webinars. Are they easy to find and follow? A strong library of learning resources signals that the company invests in its users’ success and will support your growth over time.

Pay Attention to How It Feels

Finally, go with your instincts. The right fit doesn’t just meet technical needs; it also feels right to use. If you find yourself moving smoothly through tasks, getting answers fast, and feeling confident in your output, that’s a sign the software fits your specific needs.

Finding the Software That Fits You

At the end of the day, the right-fit approach is about choosing software for your toolbox that genuinely supports your unique needs. When a solution aligns with your goals, workflows, and deliverables, everything becomes easier, leading to increased efficiency and the power to create final outputs that impress stakeholders. So, as you evaluate your next solution, take time to weigh the benefits that matter most to you and choose the platform that truly fits. When the software in your toolbox fits your needs, you’ll set yourself up for long-term success.

Want more best practices on how to approach data visualization? Subscribe to the blog now!

How to Know When Your Project Calls for 3D Visualization https://www.goldensoftware.com/your-project-calls-for-3d-visualization/ https://www.goldensoftware.com/your-project-calls-for-3d-visualization/#respond Wed, 04 Feb 2026 17:35:05 +0000 https://www.goldensoftware.com/?p=16282
This 3D visualization showcases subsurface layers and drillholes.

How to Know When Your Project Calls for 3D Visualization

In geoscience and engineering, there’s no one-size-fits-all approach to visualization. Sometimes, a well-made 2D map communicates your data perfectly. Other times, it only scratches the surface.

The truth is, 2D has an important place in your workflow. It’s fast, efficient, and sometimes all you need to communicate insights. But the challenge isn’t whether to use 2D; it’s knowing when you shouldn’t, so you can pivot to 3D. 

Many professionals rely on 2D longer than they should, missing the opportunity to use 3D visualization where it truly shines. That’s why this blog is exploring when 3D adds real value, how to recognize those moments in your projects, and why using it at the right time can make you more effective in your work.

When 2D Performs & When It Fails

Before diving into 3D, let’s start with when you should use 2D.

In many cases, 2D is the simplest and most effective choice. When you’re visualizing a single variable—like elevation, temperature, or concentration—across a relatively uniform surface, a contour or heat map often gives you all the clarity you need. It’s fast to produce, easy to interpret, and ideal for quick comparisons or report-ready visuals.
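
If you happen to build these quick-look figures in code rather than in a dedicated mapping package, very little is required. Below is a minimal, hypothetical sketch of a heat map of one gridded variable; the grid, values, and labels are invented purely for illustration and aren’t tied to any particular project. Swapping the heat-map call for a contour call (or combining both) turns the same grid into a contour map.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical gridded single variable (e.g. temperature in deg C) on a regular grid.
x = np.linspace(0, 1000, 200)            # easting (m)
y = np.linspace(0, 800, 160)             # northing (m)
X, Y = np.meshgrid(x, y)
T = 15 + 4 * np.exp(-((X - 600) ** 2 + (Y - 400) ** 2) / 1e5)   # stand-in values

fig, ax = plt.subplots()
mesh = ax.pcolormesh(X, Y, T, shading="auto", cmap="viridis")   # heat-map view
fig.colorbar(mesh, ax=ax, label="Temperature (deg C)")
ax.set_xlabel("Easting (m)")
ax.set_ylabel("Northing (m)")
plt.show()
```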

2D also shines when your goal is straightforward communication. If stakeholders only need to see trends at a glance, or if spatial depth doesn’t meaningfully change the interpretation, 2D keeps the message clear without adding unnecessary complexity. 

That said, the limitations of 2D become clear once projects move beyond the surface. When your data starts spanning multiple layers or complex subsurface features, 2D maps can flatten critical context, making it harder to fully understand what’s really happening. And that’s where 3D starts to make all the difference.

Signs It’s Time to Move From 2D to 3D

When depth, interaction, or scale become important, 3D visualization makes complex information significantly clearer than a 2D map can. To help you recognize when those moments arrive, here are five signs it’s time to move beyond 2D and start visualizing your data in 3D.

Sign #1: You’re Working With the Subsurface

Whenever your project extends below the surface—whether you’re mapping contamination plumes, tracking groundwater flow, or characterizing ore bodies—2D cross sections can only take you so far. Each view shows part of the story, but not how layers or features connect across space.

For example, imagine trying to trace a contaminant plume through multiple cross sections. You might be able to guess its shape or direction, but without 3D, you’re essentially piecing together a puzzle one slice at a time. A 3D model, however, lets you see the plume’s true geometry, including its volume, spread, and relation to wells or structure, so you can interpret and act with confidence.

Sign #2: Multiple Datasets Need To Be Interpreted Together

Many projects require bringing together different datasets like drillholes, geophysical grids, and modeled surfaces. With 2D, you can communicate these insights by creating separate maps or trying to put everything onto a single map, but both options produce unnecessary complexity.

Take, for example, a mining project where drillhole data, magnetic surveys, and surface mapping all overlap. In 2D, you might overlay these datasets or separate them onto their own maps, but either way the relationships between depth, grade, and structure would become muddled or disjointed. With 3D, those same datasets could come together in a cohesive model, revealing patterns—like a mineralized zone’s true extent—that would be difficult to detect otherwise.

Sign #3: You’re Making or Supporting High-Impact Decisions

When your project outcomes influence cost, safety, or environmental risk, visual clarity becomes non-negotiable. Fortunately, 3D provides that clarity and reduces uncertainty. For instance, if an environmental consultant is siting new monitoring wells, 3D visualization helps confirm that proposed locations intersect the plume’s flow path, not its edge. That minimizes guesswork, supports better recommendations, and helps stakeholders make faster, more confident decisions.

Sign #4: You Need To Communicate Findings to Non-Technical Audiences

Even the most detailed report can fall flat if your audience can’t visualize what you’re describing. Whether you’re presenting to regulators, clients, or community stakeholders, 3D helps bridge the gap between technical data and intuitive understanding.

For example, instead of explaining a remediation plan using a stack of cross sections, imagine showing a 3D model where the plume, wells, and infrastructure are all spatially visible. In that scenario, stakeholders can immediately see what’s happening and why it matters, reducing explanation time and increasing buy-in.

Sign #5: You’re Iterating, Updating, or Testing Scenarios

Environmental and engineering projects evolve constantly as new data comes in. With 2D, incorporating updates can mean recreating multiple maps or redrawing cross sections just to see how conditions have changed.

With 3D, updates happen in context. Say you add new borehole data to your model. Suddenly, you can visualize how the plume boundary shifts, how the groundwater gradient changes, or how structural interpretations evolve. That adaptability keeps your project agile and your insights accurate.

The Hallmark of Great Geoscience and Engineering

Strategic 3D visualization use demonstrates mastery of your craft. It shows you know when a 2D map is sufficient and when your data demands more depth. That’s why you should consider using 3D when the opportunity arises. It’ll not only help you communicate complex insights effectively but also ensure stakeholders clearly see and understand your data, leading to stronger, more informed decision-making. And that’s the hallmark of great geoscience and engineering work: visualizations that don’t just display information but also drive real-world results.

Want more tips to enhance your data visualizations to sharpen your craft and impress stakeholders? Subscribe to the blog now!

3D Models Sound Cool—But Are They Really Worth Creating? https://www.goldensoftware.com/3d-models-sound-cool/ https://www.goldensoftware.com/3d-models-sound-cool/#respond Tue, 27 Jan 2026 17:55:46 +0000 https://www.goldensoftware.com/?p=16257
This is a 3D model showcasing a satellite image, contamination plume, and drillholes.

3D Models Sound Cool—But Are They Really Worth Creating?

3D modeling has become a buzzword across the geoscience and engineering community, and for good reason. From effectively visualizing subsurface features to clearly communicating project results to stakeholders, 3D models offer a level of precision and understanding that flat visuals simply can’t match. However, it’s still fair to wonder: is creating a 3D model truly worth it? In our opinion, it’s absolutely worth it.

Why Many Hesitate to Create 3D Models

Before highlighting why you should create 3D models, let’s explain why it’s hard to even entertain the thought. For many geoscientists and engineers, the idea of creating a 3D visual sounds great in theory—but in practice, it often feels unnecessary or even intimidating. There are two common reasons for this.

You don’t see any clear benefits

When 3D modeling isn’t explicitly required by stakeholders, it’s easy to question its value. If a 2D map communicates your insights, creating a 3D model can feel like extra effort without a clear reward. From that perspective, 2D works just fine. Why invest extra time building something that doesn’t appear to move the project forward right now? That’s especially true when you’re juggling multiple deadlines. When time is limited, it’s natural to prioritize what’s proven, familiar, and fast.

You don’t want to learn another tool

Even though certain 3D modeling tools are more intuitive than ever, learning something new can still feel like another task on a long to-do list. Consequently, you may explore 3D tools but stop short of truly giving them a chance, simply because you don’t want to add another item to your workload when you already have many client projects to complete.

The Benefits That Make a Difference

While the reasons for avoiding 3D are understandable, the value these visuals provide makes them worth serious consideration. To give you more context, here’s how 3D models change the game for geoscientists and engineers.

Deeper insight into the surface and subsurface

Visualizing both surface and subsurface data is inherently complex. In 2D, it often requires extensive explanation, as stakeholders try to piece together insights from multiple maps or interpret crowded information on a single visual. A 3D model removes that friction by providing a clear, spatially accurate picture of what’s happening above and below the surface. For example, you can show how a contamination plume extends beneath buildings or infrastructure, or how subsurface features align with proposed drilling targets, making spatial relationships immediately clear without extra explanation.

This 3D model incorporates different types of data to provide a realistic look at the surface and subsurface.

A clearer understanding that drives better conversations

3D models bring your data to life. Instead of asking stakeholders to interpret flat layers and contour lines, you’re giving them a realistic, spatially accurate view of the environment you’re studying. This makes it easier for both technical and non-technical audiences to intuitively understand what’s happening and why it matters.

Because the data is easier to grasp, conversations improve, too. Stakeholders can explore the model, ask better questions, and engage in more meaningful discussions. Rather than asking for explanations about your 2D map, everyone is looking at the same clear, cohesive visual, which leads to productive conversations that drive decisions.

This 3D model highlights various subsurface layers and drillhole paths.

Integration that connects every data source

Most water resource, environmental, and geotechnical projects rely on data from multiple sources. A 3D model can bring all those pieces together into one coherent view. That way, you can pull insights from your data more easily than if you were analyzing everything on a 2D map. This integrated perspective makes it easier to spot gaps, inconsistencies, and meaningful connections across datasets. 

3D models should focus on highlighting critical data points like this one does. This visual puts the spotlight on the contamination in the subsurface.

Quantifiable results that drive real-world impact

Beyond visualization, 3D models allow for true volumetric analysis. You can measure things like ore reserves, aquifer storage capacity, or contaminant volume, helping you estimate resources, assess project feasibility, or calculate remediation needs with precision. It’s not just about seeing your data but also quantifying it accurately to support better planning and cost decisions.

This is a 3D volume render created in Surfer.
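
To make the arithmetic behind that kind of volumetric estimate concrete, here is a minimal sketch that counts grid cells above a concentration cutoff. The grid values, cell sizes, and threshold are hypothetical stand-ins, not a substitute for the volume calculations built into your modeling software.

```python
import numpy as np

# Hypothetical gridded concentration model on a regular 3D grid.
# In practice these values would come from your interpolated site model.
rng = np.random.default_rng(0)
conc = rng.random((80, 60, 20))   # stand-in concentrations (e.g. mg/L)
dx = dy = 10.0                    # horizontal cell size (m)
dz = 1.0                          # vertical cell size (m)
cutoff = 0.7                      # threshold of concern, same units as conc

cell_volume = dx * dy * dz                                     # m^3 per cell
plume_volume = np.count_nonzero(conc > cutoff) * cell_volume   # total m^3 above cutoff
print(f"Estimated plume volume above cutoff: {plume_volume:,.0f} m^3")
```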

The Benefits of 3D Models in Action

The advantages of creating 3D models are easy to see in every corner of geoscience and engineering. Whether you’re uncovering history, managing groundwater, or planning infrastructure, 3D visualizations can bring value to your work. Here are just some of the most impactful ways 3D models can shape projects in various industries.

Archaeology: unveiling the past for smarter excavation

By combining ground-penetrating radar (GPR) or magnetometry data with a digital elevation model (DEM), archaeologists can generate a 3D visualization of subsurface features like foundation walls, ditches, and hearths. This equips teams to see the layout and depth of archaeological sites before breaking ground so they can target excavations precisely, reduce cost and time, and preserve fragile artifacts that might otherwise be lost.

Environmental science: communicating risk and guiding remediation

Environmental scientists can transform borehole and chemical sampling data into a 3D plume model showing contaminant concentrations through subsurface layers. By overlaying this with satellite imagery, they can design a clear picture of where contamination is concentrated and how it’s spreading. This can help regulators, engineers, and community stakeholders instantly grasp the scope of contamination, leading to faster approvals, clearer communication, and more strategic placement of monitoring or recovery wells for effective remediation.

Engineering: improving site safety and design communication

Geotechnical engineers can integrate borehole logs, seismic refraction data, and cross-sections into a 3D model of the subsurface. By creating this model, they can highlight layers of soil and rock, groundwater zones, and structural weaknesses that could affect foundations or tunnels. This model could help both engineers and clients quickly visualize potential hazards or stability concerns, leading to more informed site selection and design adjustments before construction begins.

Water resources: understanding aquifer geometry and vulnerability

Hydrogeologists can use well log and geophysical data to build 3D representations of aquifers, showing where groundwater is stored and how it moves through the system. With this visualization, water resource managers can easily identify recharge zones, discharge areas, and areas at risk of contamination. This can support better management decisions, such as setting pumping limits or designing effective protection zones, and help communicate aquifer vulnerability to policymakers and the public.

Resource exploration: interpreting geology for smarter development

Exploration geologists can construct 3D models of ore deposits or reservoirs by interpolating core samples and geophysical survey data. They can then overlay drilling data to assess which zones have the highest economic potential. With the ability to view the resource from any angle, geologists can refine their interpretations, reduce unnecessary drilling, and optimize extraction strategies, saving time, minimizing costs, and maximizing resource yield.

Seeing Is Believing: 3D Modeling Is Absolutely Worth It

3D modeling is an easier way to communicate and understand data. By revealing depth, relationships, and spatial context that 2D maps can’t capture, 3D models help you give stakeholders clearer insights and stronger confidence in their decisions. They also help you connect complex datasets, uncover hidden patterns, and present findings in ways that everyone can understand. 

So if you’ve ever questioned whether creating a 3D model is truly worth it, the answer is simple: it is. The impact it brings to your work makes it one of the most valuable tools you can use. Want to explore Surfer’s 3D tools to bring your data to life? Download the 14-day free trial of Surfer!

Behind the Grids: How Kriging Helped Map Alberta’s Groundwater and Geology https://www.goldensoftware.com/how-kriging-helped-map-alberta/ https://www.goldensoftware.com/how-kriging-helped-map-alberta/#respond Tue, 06 Jan 2026 19:23:32 +0000 https://www.goldensoftware.com/?p=16129
This image shows Roger Clissold, Founder and Principal Hydrogeologist at Hydrogeological Consultants Ltd.

Behind the Grids: How Kriging Helped Map Alberta’s Groundwater and Geology

Roger Clissold, Founder and Principal Hydrogeologist at Hydrogeological Consultants Ltd. (HCL), has a long history of turning complex, sparse water well data into usable, public resources. And for decades, his firm has relied on the gridding capabilities of Surfer to tackle massive, province-wide projects—most notably, mapping the groundwater resources for approximately 40% of Alberta, Canada, an area that’s around 277,000 square kilometers, or about the size of a state.

A Long History of Gridding Innovation

HCL’s relationship with Surfer began in the early 1980s. A key moment was when one of their employees returned from a groundwater modeling course in Denver, Colorado, bringing back essential knowledge of the program. The only issue? Surfer didn’t have Kriging. Roger and his team eagerly awaited the implementation of the gridding method, and, to the credit of Golden Software’s product team, Kriging was available within a couple of months, making Surfer the go-to software at HCL. 

“At that particular time, we were paying around $300 for a Surfer license,” Roger recalled. “The only other program that was available for us to do gridding using the Kriging method was over $100,000. It was hugely expensive. But with Surfer, we had a $300 program, and we just came to love the software. We’ve worked with Surfer forever.”

The Massive Project: Mapping Alberta’s Groundwater

It was the unique, cost-effective power of Surfer’s gridding capabilities that equipped HCL to take on its most significant and defining project: mapping the groundwater resources across the entire “white area” of Alberta, which refers to the settled, populated portion of the province, as opposed to the forested “green area.”

The project began with the federal government’s involvement through the Prairie Farm Rehabilitation Administration (PFRA). PFRA provided some of the funds for local counties to hire HCL to map the province’s groundwater resources. The goal was to transform data—which had primarily been stored on paper—into a searchable, accessible digital system (e.g., GIS) and deliver a product to the public that would answer essential questions about their land, including expected groundwater supply and related elements like:

  • Drilling depth required
  • Volume of groundwater available
  • Quality of the groundwater

To accomplish this, Roger and his team built an incredibly massive database. While the provincial government required water well drillers to provide basic data, HCL created its own database, which was, and still is, constantly updated with new information, including extensive groundwater chemistry and field-verification data.

Additionally, the government funds paved the way for the precise positioning of thousands of water wells, improving the spatial control from a general quarter-section (an 800-meter by 800-meter area) to a specific set of coordinates. With this better spatial control, HCL could obtain accurate elevation data, which made their final results far more meaningful for the end-user.

Surfer Grids: The Core of the Solution

While gathering data for many of Alberta’s counties, HCL gridded the information using Surfer and built a query system to make their database easier to navigate. That way, an end-user could simply type in their coordinates and receive an instant report detailing the expected aquifer interval, groundwater volume, groundwater quality, and initially even the estimated cost to drill a water well. 
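
As a rough illustration of what “querying a grid” at a coordinate can look like under the hood, here is a minimal bilinear-interpolation sketch. The function name, grid layout, and example values are assumptions made only for illustration; they are not HCL’s actual query system.

```python
import numpy as np

def query_grid(grid, x0, y0, xmin, ymin, dx, dy):
    """Bilinearly interpolate a regular grid at the point (x0, y0).

    grid[j, i] holds the value at x = xmin + i*dx, y = ymin + j*dy,
    and (x0, y0) is assumed to lie inside the grid.
    """
    fx = (x0 - xmin) / dx
    fy = (y0 - ymin) / dy
    i, j = int(np.floor(fx)), int(np.floor(fy))
    tx, ty = fx - i, fy - j
    return ((1 - tx) * (1 - ty) * grid[j, i]
            + tx * (1 - ty) * grid[j, i + 1]
            + (1 - tx) * ty * grid[j + 1, i]
            + tx * ty * grid[j + 1, i + 1])

# Example: look up a hypothetical aquifer-top elevation grid at a well location.
aquifer_top = np.array([[520.0, 522.0],
                        [518.0, 521.0]])   # tiny 2x2 stand-in grid (m above sea level)
print(query_grid(aquifer_top, x0=450.0, y0=350.0,
                 xmin=0.0, ymin=0.0, dx=800.0, dy=800.0))
```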

On top of that, HCL was also asked to standardize Alberta’s diverse geological nomenclature. Historically, different areas of the province had used different names for the same formations, leading to confusion. For example, the “Viking Formation” in the northern part of the province and the “Bow Island Formation” in the southern part of the province were one and the same.

“The federal government asked us, ‘Can’t you make this more uniform across the whole province?’” Roger said. “So, we ended up mapping the geology of all of Alberta, and we gridded everything with Surfer.”

On HCL’s website, you can see all the reports they completed for their province-wide project with the federal government. One example that Roger highlighted is the Cardston County Report, which includes text and many visuals.

“The index map is the topography,” Roger said when briefly walking through the report. “We have one grid for the entire province of Alberta, but these individual areas in the province that we gridded [like Cardston County] are how it all started. The second map in the report shows the locations of water wells and springs. All of those were gridded with Surfer. We didn’t use anything else. But the whole functionality that made the visuals really useful was the fact that we could build a query to query these grids.”

The Kriging Method: Art and Science

The remarkable ability to build a query system that pulls critical hydrogeological information was entirely dependent on having good grids, which for HCL were always created using the Kriging method in Surfer.

Roger and his team, like many in the field, regarded Kriging as the best way to grid highly variable hydrogeological data because it provided the closest result possible to reality. Roger noted that at an International Association of Hydrogeologists (IAH) conference in 1995, Surfer was essentially the only program being used by hydrogeologists from many countries around the world, a testament to its “phenomenal penetration of the hydrogeological market” due to its effective features, including its gridding capabilities. However, Roger still offers a vital warning that scientists should keep in mind: “Kriging is great, but it isn’t reality.”

Roger explained that Alberta’s hydrogeology is highly variable, largely due to fracturing caused by the ground rebounding as continental ice sheets melted and by the westward movement of the Rocky Mountains driven by continental drift. You can move just one meter and find a totally different water well yield. While Kriging did an excellent job of interpolating the data between known water well points, it still gave an estimation. That didn’t mean it was untrustworthy. It just meant the gridding method didn’t show reality, and Roger kept that in mind, even though he thought Kriging did a great job overall.

“Part of the work we do is trying to identify these fractures from surficial features, and that’s got a lot of art in it,” Roger said. “We still don’t have much science behind it. It’s more art than science. But generally, if you have 10 water wells in an immediate area, they probably reflect better what’s going on. The highest yield will be on those fractures. But off those fractures, where most water wells are completed, the data that we have from the surrounding water wells are usually the best indicators. So, what you have is this value here and here, then Kriging takes over, and it does a great job. It really is an excellent method of gridding.”
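
For readers curious about what happens when “Kriging takes over,” here is a minimal, self-contained sketch of ordinary kriging at a single point. The spherical variogram, its parameters, and the sample well values are assumptions chosen only for illustration; production gridding software adds variogram fitting, search neighborhoods, and many safeguards on top of this basic idea.

```python
import numpy as np

def spherical_variogram(h, sill=1.0, rng=500.0, nugget=0.0):
    """Spherical variogram model, a common choice for hydrogeological data."""
    h = np.asarray(h, dtype=float)
    g = nugget + sill * (1.5 * (h / rng) - 0.5 * (h / rng) ** 3)
    return np.where(h >= rng, nugget + sill, g)

def ordinary_kriging(xy, z, xy0, variogram=spherical_variogram):
    """Estimate the value at xy0 from scattered samples (xy, z) by ordinary kriging."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)   # pairwise distances
    # Kriging system: [[Gamma, 1], [1^T, 0]] [weights; mu] = [gamma0; 1]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - xy0, axis=1))
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z)   # weighted average of the sample values

# Example: four hypothetical water wells and an estimate halfway between them.
wells = np.array([[0.0, 0.0], [400.0, 0.0], [0.0, 400.0], [400.0, 400.0]])
yields = np.array([12.0, 15.0, 9.0, 14.0])   # stand-in well yields
print(ordinary_kriging(wells, yields, np.array([200.0, 200.0])))
```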

The Data That Powers Global Decisions

The work Roger and his team accomplished shows how powerful tools can solve large-scale problems. By leveraging Surfer’s Kriging capabilities, HCL transformed scattered, raw data into a uniform, queryable resource for an entire province. As a result, they not only delivered a better understanding of Alberta’s hydrogeology but also provided a publicly available tool that empowers the community to make informed decisions about one of their most vital resources: groundwater.

Want more stories like this one? Subscribe to our blog so you never miss an update!

Turning Subsurface Data Into Opportunity: How a Geologist Uses Software & Expertise to Guide Drilling Decisions https://www.goldensoftware.com/subsurface-data-to-guide-drilling-decisions/ https://www.goldensoftware.com/subsurface-data-to-guide-drilling-decisions/#respond Thu, 04 Dec 2025 15:05:30 +0000 https://www.goldensoftware.com/?p=15830
This Surfer-generated visual is a net pay map in meters, which can be used for evaluating reservoir potential.

Turning Subsurface Data Into Opportunity: How a Geologist Uses Software & Expertise to Guide Drilling Decisions

John Andersen is a geology consultant and Senior Geologist with O’Chiese Energy LP in Calgary, a city in Alberta, Canada. Working with the O’Chiese First Nation, which owns a 53-section block of land containing some of the largest natural gas pools in Western Canada, Andersen aims to maximize the resource wealth for the Nation. Part of achieving this requires identifying new drilling opportunities that the First Nation can pursue for future growth to enhance their portfolio.

For Andersen, this work is impossible without precise, reliable subsurface mapping. He relies on the powerful capabilities of Surfer—using it as an essential add-on to his geological data platform, GeoScout—to transform vast, complex data into clear, actionable visuals.

The Evolution of Geological Mapping: From Microfiche to Digital Speed

Andersen’s career spans the evolution of geological data management, giving him a unique perspective on the power of modern tools. He recalls a time when creating a single map was an arduous, manual process.

“Typically, when I started in my career, we always hand-contoured all the mapping,” Andersen said. “We had to go in and create the data, pick the data, and then map it.”

Before the internet or digitized records, data storage was entirely physical. Geologists like Andersen relied on microfiche and paper printouts. 

“If you go back in history, we had to actually create stuff from microfiche, look at it on the microfiche, and then make a note and save that on a map because there was no electronic media,” Andersen explained.

The shift to digitized data, spreadsheets, and eventually interactive 2D software like GeoScout pushed the industry forward by making it easier to store data and map it. And today, the addition of user-friendly mapping software like Surfer has improved efficiency and the visual design of maps significantly.

“The biggest success was increasing my efficiency to be able to put things out faster,” Andersen said. “Also, a computer can draw a nice smooth curve better than my pencil. So, it’s a prettier, more presentable product at the end of the day.”

The Mapping Process: Turning Raw Data into Subsurface Insight

To create a high-quality final product, Andersen’s process begins with raw field data, which includes seismic and stratigraphic measurements used to determine the exact location and depth of hydrocarbon reserves. This data is the foundation for his projects and leads to the following workflow:

  • Data Acquisition and Input: The raw data is fed into GeoScout, and Andersen explained that having good data is critical. Surfer (which integrates with GeoScout) can only use the data you give it. If you’ve got bad data, you have to understand how to clean it accordingly. 
  • Structural Contouring: Surfer generates the subsurface structural contour map (a minimal gridding sketch follows this list). This map clearly outlines the O’Chiese Indian Reserve boundary, with a grid reflecting the original imperial-based survey (one-mile squares), even though Canada is now metric.
  • Visualization and Layering: The map is then layered with well data. Different colors and symbols are used to distinguish horizontal wells targeting various hydrocarbon zones. 
  • Strategic Focus: By layering and isolating data points, Andersen uses the final map to determine optimal drilling locations. The goal is to decide where to drill more wells, leading to more financial opportunities in new areas.
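
The structural-contouring step referenced above essentially interpolates scattered formation-top picks onto a regular grid and contours the result. Here is a minimal, hypothetical illustration of that idea using a generic interpolation method; the well coordinates and depths are invented and do not represent Andersen’s data or the specific gridding algorithm he uses.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.interpolate import griddata

# Hypothetical formation-top picks: easting/northing (m) and top elevation (m).
rng = np.random.default_rng(1)
wells = rng.uniform(0, 5000, size=(30, 2))
tops = 1200 - 0.05 * wells[:, 0] + 0.02 * wells[:, 1] + rng.normal(0, 5, 30)

# Interpolate the scattered picks onto a regular grid.
gx, gy = np.meshgrid(np.linspace(0, 5000, 200), np.linspace(0, 5000, 200))
gz = griddata(wells, tops, (gx, gy), method="cubic")

# Draw the structural contour map with the well locations overlaid.
fig, ax = plt.subplots()
cs = ax.contour(gx, gy, gz, levels=15, colors="black", linewidths=0.6)
ax.clabel(cs, fontsize=7)
ax.plot(wells[:, 0], wells[:, 1], "r^", markersize=4, label="Wells")
ax.set_xlabel("Easting (m)")
ax.set_ylabel("Northing (m)")
ax.legend()
plt.show()
```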

“If you’re just coming into a new area you’ve never worked before, Surfer will create a map, and then you’ll have a much better handle on how to approach things in a proactive manner without literally thinking, ‘where do I start?'” Andersen explained.

The Geologist’s Value: Interpreting the Curves

While Andersen relies on user-friendly software to create maps that showcase drilling opportunities, he stresses that technology is merely a tool, and it doesn’t replace the expert interpreter. That’s particularly true when it comes to data integrity.

“The final output is only as good as the data you’re using,” Andersen said. “Do you believe what you’re using as your source?”

Andersen’s experience equips him to assess the validity of his data and the resulting map. With his technical expertise and understanding of geology and subsurface concepts, he can look at an area and confidently identify which of its multiple hydrocarbon layers offer the highest potential returns, equipping him to confidently say, “I think we can drill 10 more wells if you go this way.”

Communicating Complex Geology: The Power of Visualization

Once the data is analyzed and insights are gathered, clear visual communication becomes paramount, especially when presenting to non-technical stakeholders. Andersen’s direct audience, the O’Chiese First Nation, owns the land, and while its members live on it, presenting a traditional cross-section diagram full of industry jargon wouldn’t be helpful.

“That’s where a nice map is worth a thousand words,” Andersen noted. 

By using Surfer to create intuitive visualizations—and providing graphical analogies along with it—he showcases what’s happening miles beneath his audience’s feet to explain the drilling targets.

“One of the things that was the most convincing was when we showed a story about the rocks,” Andersen said. “We showed a picture of a layer cake and said, ‘These are the layers, and then we’re drilling wells into the pink layer and the blue layer.’ That was huge in terms of getting traction for understanding the graphical, simplistic representation of what we’re trying to do.”

Additionally, Andersen uses Surfer’s 3D View to show how multiple wells target different layers at varying depths. Providing this model is key for clarity, helping the First Nation understand the subsurface reality of their resource portfolio so they can effectively pursue future growth.

Net Pay Map: This Surfer-generated visual is a net pay map in meters that John Andersen created. Maps like these are crucial for evaluating reservoir potential.

Combining Expertise and Software Tools

Andersen’s work illustrates how modern technology and decades of geological expertise can come together to achieve meaningful results. While software plays a vital role in turning his raw data into clear, actionable visuals, it’s also Andersen’s deep understanding of geology—and his ability to interpret what the data truly means—that gives those visuals purpose.

In many ways, his approach represents the best of both worlds: human expertise grounded in experience and amplified by technology. The result is a workflow that helps transform geological complexity into economic empowerment, ensuring the O’Chiese First Nation can continue making informed decisions about its most valuable natural resources.

Want more stories on how geologists and engineers are making an impact around the world? Subscribe to our blog so you never miss an update!

How Geophysics Revived a School District’s Construction Plan: The Story of a Once ‘Un-Buildable’ Site https://www.goldensoftware.com/geophysics-revived-a-construction-plan/ https://www.goldensoftware.com/geophysics-revived-a-construction-plan/#respond Wed, 19 Nov 2025 16:49:50 +0000 https://www.goldensoftware.com/?p=15725
This 3D model reveals the geophysical survey results from the project site Andy examined.

How Geophysics Revived a School District’s Construction Plan: The Story of a Once ‘Un-Buildable’ Site

When a school district out west set out to build a new facility, they needed to know if their preferred location—which included a portion of a soccer field—was suitable for construction. After hiring a geotechnical firm to investigate the site, the school district got their answer: it wasn’t. Based on the results, the site was labeled “un-buildable.” But was it really?

Unconvinced by the vague conclusions and high uncertainty, the district sought a second opinion. That’s when they brought in a new geotechnical firm that consulted Andy Siemens, P.E., G.E., who is known for his innovative use of geophysics to deliver precise site evaluations.

Another Opinion and a New Approach

The school district’s initial investigation relied on air-percussion drilling methods, which included over 30 borings drilled across and around the soccer field. Although the borings were numerous, the results were far from clear. The terrain—a complex volcanic landscape of fractured basalt and thin soils—produced inconsistent drilling responses that were interpreted as air-filled voids scattered unpredictably throughout the area, some estimated to be as large as 20 feet tall and just 10 feet below the surface. With limited confidence in their data, the original geotechnical firm recommended getting a second opinion, and the school district acted on that suggestion.

The new firm they hired wanted to use sonic borings to better understand the extent of the voids, but this drilling method was expensive, leading to concerns about exploration costs. To keep the project within the school district’s budget, the geotechnical firm needed to limit drilling expenses by restricting exploratory borings to carefully selected, strategic locations. This is where Andy came into the picture.

The second geotechnical firm consulted Andy to discuss an idea: rather than blanket the field with costly sonic borings, Andy could perform a geophysical survey to map subsurface anomalies and guide the placement of future borings. This would reduce costs while increasing confidence. As it became clear that this approach was necessary, the geotechnical firm contracted Andy, who planned on conducting a survey using electrical resistivity tomography (ERT) in 3D.

But before jumping into the project, Andy knew stakeholder buy-in was essential—especially since geophysical methods are often misunderstood or overlooked. To build trust in the technology and approach, Andy first tested his methods on a nearby site known to have geologic features and air-filled voids similar to the ones that supposedly affected the school district’s preferred location. When Andy’s ERT survey results matched the known conditions, it gave stakeholders the confidence they needed to approve the survey at the school’s preferred site.

“We used software to show the results from that demonstration survey,” Andy said. “We created a 3D visualization of a known air-filled void to show that the method worked and provide confidence that if we apply the same method at the preferred site and there’s something we need to be concerned about, we’re going to find it.”

Revealing Geophysical Survey Results Using a Story-Driven Method

After moving forward with the ERT survey at the official project site, the results revealed a much more accurate picture of subsurface conditions. Andy identified zones of higher and lower resistivity—pinpointing areas where voids might be present, and more importantly, where they likely were not. Using the software, he then built a 3D model of the entire site that became a cornerstone of the project. 

“One of the selling points was the 3D model,” Andy explained. “The stakeholders could look at that and get some confidence that, indeed, the entire site was explored. And if there were any issues or threats to the structure from subsurface conditions, they could be pretty confident that, if they existed, they would’ve been found.”

The 3D model included three critical elements:

  • Contour layers to illustrate terrain slope
  • Isosurfaces to showcase zones of anomalous resistivity (see the sketch after this list)
  • Drillholes to highlight the precise locations for sonic drilling
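
To give a rough sense of how an isosurface can be extracted from a gridded resistivity volume, here is a minimal sketch using the marching cubes algorithm. The grid, threshold, and variable names are hypothetical and unrelated to the software or survey data Andy actually used.

```python
import numpy as np
from skimage import measure

# Hypothetical gridded resistivity volume (ohm-m) on a regular 3D grid.
rng = np.random.default_rng(2)
resistivity = rng.lognormal(mean=4.0, sigma=0.5, size=(60, 60, 30))

# Extract the triangulated isosurface enclosing cells above an anomaly threshold.
threshold = 150.0   # ohm-m, chosen only for illustration
verts, faces, normals, values = measure.marching_cubes(resistivity, level=threshold)

print(f"Isosurface has {len(verts)} vertices and {len(faces)} triangular faces")
```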

These visual elements were critical, but for Andy, it wasn’t just about visualizing the data. It was about telling a story clearly and convincingly. The software played the biggest role in achieving this—but a video-making platform also helped. 

For Andy’s report, he created a video that walked stakeholders through the model step by step. The video began with an empty 3D box, then progressively revealed the site topography, ERT results, and boreholes—all while Andy narrated what was being shown and why it mattered. As he explained the survey sequence, he revealed each corresponding point in the model, and this interactive, reveal-as-you-go approach helped bring his findings to life.

“You can describe everything in a video in a super cool way to deliver results,” Andy explained. “Clients love it. You’re able to tell a story and reveal it as it’s happening.”

This 3D visualization shows the project site that Andy and his team surveyed.

The Truth Revealed: The Correct Status of the Preferred Site

The combination of geophysics and a story-driven 3D model helped guide the geotechnical firm’s exploratory borings, and their findings matched the results from Andy’s survey, including one minor anomaly. That anomaly turned out to be a zone of unconsolidated pumicite surrounded by basalt—exactly the kind of subtle variation that could’ve been missed without targeted exploration.

“An anomaly had shown up in the 3D model,” Andy said. “Being able to visualize everything in the area of interest was really important. And the fact that the 3D model showed an anomaly within the zone of interest that the geotechnical exploration targeted was a big deal.”

The geotechnical company also confirmed that the subsurface was vesicular, fractured, and jointed, all of which were characteristics of highly variable rock. However, neither the firm’s sonic drilling method nor Andy’s geophysical survey found large, air-filled voids within the school district’s preferred location, which meant one important fact: the desired location wasn’t “un-buildable” but was actually usable.

Moving Forward with Confidence

Thanks to a combination of geophysical surveying, modern visualization, compelling storytelling, and innovative drilling methods, Andy and his collaborators helped move a building project forward. The school district got the green light to develop a facility at their preferred location, and construction proceeded as planned. In the end, it was a win for the school district, the geotechnical firm, Andy, and the community that would benefit from the new facility.

Want to learn more about technologies empowering better decision-making and unlocking new possibilities in geoscience? Subscribe to the Golden Software blog for more stories like this one.
