Tech Archive - TransSwipe - Merchant Services and Credit Card Processing


Archive for the ‘Tech’ Category

On-demand Bank Transfers, made easy


In 2011 we released an OAuth API that made it easy for developers to request transfer permissions from their customers with our Dwolla-branded platform.

Today we’re making the ability to bill your customer later available for our Dwolla White Label customers in our v2 API.

It’s called On-Demand Bank Transfers.

Developers using our white label APIs can enable their payers to authorize ACH transfers for variable amounts from their bank account at a later point in time, for products or services delivered. It’s one simple additional step—a quick authorization from the customer when they instantly verify a bank account. This is great for companies like:

  • Cloud computing services. Fees can be different every month, requiring ongoing authorization so a customer can easily pay for a service, and the company can easily bill for the service.
  • Utilities. A water company bill is rarely the same each month. Same with electrical, and gas. The amount collected at the end of each month is usage-based, or metered.
  • Ride sharing or asset sharing platforms. The amount a customer is charged for a ride across town depends on a variety of factors. We make it easy for sharing companies to bill their customers, while reducing the hassle for the end customer on each trip.
  • B2B services that bill on a variable basis. Some orders may require a bank transfer on NET terms and others may be fulfilled once the goods are delivered. Either way, both should be possible.

The instant bank verification and on-demand authorization both happen within Dwolla.js, making this incredibly easy to add to your software. It adds one extra step to the bank verification process: acquiring the account holder’s permission for future transfers.

On-demand payments from Dwolla

Once you have collected all of the authorizations required for a bank transfer, including the additional authorization from the end user for on-demand bank transfers, your software application kicks off a transaction that looks like this whenever the customer needs to be billed:

    {
        "_links": {
            "source": {
                "href": "https://api-uat.dwolla.com/funding-sources/5cfcdc41-10f6-4a45-b11d-7ac89893d985"
            },
            "destination": {
                "href": "https://api-uat.dwolla.com/customers/C7F300C0-F1EF-4151-9BBE-005005AC3747"
            }
        },
        "amount": {
            "currency": "USD",
            "value": "225.00"
        },
        "metadata": {
            "customerId": "8675309",
            "notes": "Payment for January 2016"
        }
    }
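If it helps to see how that payload might be assembled in code, here is a minimal sketch in Node.js. The `buildTransferRequest` helper is hypothetical (not part of any Dwolla SDK), and the resource URLs are the sandbox examples from the snippet above:

```javascript
// Hypothetical helper that assembles the on-demand transfer payload shown
// above. The funding-source and customer URLs are the sandbox examples from
// this post; substitute the resource URLs for your own application.
function buildTransferRequest(sourceUrl, destinationUrl, value, metadata) {
  return {
    _links: {
      source: { href: sourceUrl },
      destination: { href: destinationUrl }
    },
    amount: { currency: 'USD', value: value },
    metadata: metadata
  };
}

var request = buildTransferRequest(
  'https://api-uat.dwolla.com/funding-sources/5cfcdc41-10f6-4a45-b11d-7ac89893d985',
  'https://api-uat.dwolla.com/customers/C7F300C0-F1EF-4151-9BBE-005005AC3747',
  '225.00',
  { customerId: '8675309', notes: 'Payment for January 2016' }
);

// The resulting object serializes to the JSON body above, ready to POST to
// the transfers endpoint with your usual HTTP client.
console.log(JSON.stringify(request, null, 2));
```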

As usual, there are no per transaction fees for either party in the transaction, and with our white label services your brand is front and center. If you’re a developer and have more questions, head to our API Support discussion board and post your questions. Our engineering team regularly reads and responds on the board.

The cost and complexity of making bank transfers better is well known, and it affects new fintech companies, large established businesses, and even big VCs… We all have the same problem.

We didn’t have it figured out either, and that’s why our team has spent the last 5 years focused on bank transfers. It’s why we feel on-demand bank transfers are so incredibly valuable for our customers.

There is a better way, and we’re excited to be a part of building the future by making bank transfers easier for businesses and developers.

Contact us to enable on-demand bank transfers for your application

Fake it as you make it: why fake services are awesome for developers


This blog post comes from Shea Daniels, a developer here at Dwolla. When Shea isn’t busy building awesome new things, you can find him out for a run.

It’s often said in life that we “stand on the shoulders of giants.” This rings especially true now that we’re in an era of abundant open source software and SaaS providers. Now, more than ever, we build applications by relying on tools and services that others have made. This may even be standard practice inside your own organization as other teams deliver functionality through a microservices architecture.

Building software by composing services is extremely powerful, but it can still be a rocky road. Several factors can make it difficult to write and test your code:

  • Complex scenarios may not be easy to test with real data
  • Running elaborate business logic may consume resources
  • Sandbox environments may not exist for 3rd party APIs

Just fake it

So what can be done to mitigate these issues? The answer is to fake it while you’re making it!

You can see this in everyday life. Whenever the real thing is too expensive or impractical, we sub it out with something fake as a stand in—think movie props or mannequins for tailors. This is also a fairly common engineering practice; my favorite examples are the boilerplate capsules used to evaluate rockets and other space hardware.

In the software world, if you practice TDD you should be familiar with the use of test doubles (mocks, fakes, and stubs) for dependencies in your unit testing code. Used instead of the real implementations of objects, fake dependencies isolate the code under test by providing predictable results given certain input. This isolation is useful for tracking down issues and fully exercising your code without complicated setup.

The same concept can be applied when developing an integration with a third party service. By building a fake copy of the web service, you gain the same advantages of isolation and repeatability as you test your application. This is especially useful if the service you’re depending on is being developed in tandem and has yet to be fully implemented.

There are some existing tools for quickly standing up your own fake services, such as Nock and Frock. But with Node.js and a few NPM packages, it’s easy enough to build your own from scratch.

In this post we’ll cover:

  • An actual example
  • How to get started
  • Possible scenarios
  • Some of the downsides

A real example

Let’s break down a real example that Dwolla has recently open sourced: Nodlee. You can see it in action by checking out our instant bank account verification demo—here it’s used as a backing service.


Getting started

Nodlee is a simple web server written in JavaScript and run via Node.js. It depends on a handful of NPM packages, which you can see in the package.json file, including:

  • minimist – argument parser library
  • node-cache – simple in-memory caching library

If you haven’t used Node or express before, there are a ton of great tutorials, or you can read through the Nodlee source code to get a feel for it. The readme has a lot of great info and the code entry point is app.js.


The first thing to do when building out a fake service is to look at the documentation for the real API and experiment with the service to discover how it works. With that knowledge, you can figure out which endpoints need to be mocked and what the responses should look like.

For a simple example, here’s the Nodlee health check endpoint response: health.js

module.exports = function (req, res) {
  res.json({ healthy: true });
};
These responses can be as simple or as complicated as needed. If returning the same canned response every time isn’t enough, consider scanning the request for sentinel values that you define. Then you can use those values to decide which data to send back. You can even use a templating language like Handlebars for generating your responses if you want to get swanky.
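As a rough sketch of the sentinel-value idea (the account numbers, field names, and responses below are invented for illustration, not taken from Nodlee):

```javascript
// A fake endpoint picks its canned response based on sentinel values that
// you and your tests agree on in advance. Everything here is illustrative.
function fakeBalanceResponse(accountNumber) {
  if (accountNumber === '0000000000') {
    // Sentinel: force the error path
    return { error: 'ACCOUNT_NOT_FOUND' };
  }
  if (accountNumber === '9999999999') {
    // Sentinel: force the zero-balance path
    return { balance: '0.00' };
  }
  // Default happy path for any other input
  return { balance: '1000.00' };
}

// Wired into an express handler, this might look like:
// app.post('/balance', function (req, res) {
//   res.json(fakeBalanceResponse(req.body.accountNumber));
// });
```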

Complex scenarios

For the instant account verification product we were building, even sentinel values and templating weren’t quite enough. We found that we were constantly editing multiple files in the fake service code to set up complex scenarios.

The first step to making this easier was to consolidate all of the possibilities that determined a particular scenario into a single list of options in the code: scenario.js

module.exports = function Scenario() {

	this.contentServiceId = 0;
	this.isMfa = true;
	this.mfaTimeout = false;
	this.mfaPass = true;
	this.mfaTypes = ['q'];
	this.refreshesSet = 1;
	this.refreshesLeft = 1;
	this.errorType = 0;
	this.accounts = [
		{
			routingNumber: '222222226',
			accountNumber: '5031123001',
			accountType: 'CHECKING',
			accountHolder: 'John Wayne',
			accountName: 'Your Account #1',
			balance: '1000'
		}
	];
};
This object can then be checked in all of the service endpoints in order to determine the appropriate response. With this in place, developers could set up the flow to behave however they wanted just by editing this single file.

Sentinel values on steroids

We have local development covered now, but what about testing in sandbox environments where we can’t edit the fake service code? Not only that, but what if we wanted coverage of our flow with automated UI tests (e.g. Robot Framework)?

What we need now is a service with a memory longer than a single web request and a way for automated tests to trigger whatever scenario is needed. This is where the minimist and node-cache NPM packages come into play.

With minimist, we are able to take certain inputs in a web request and treat them as if they were command line interface options. Those options can then be translated in order to set the properties of the Scenario object we’ve just discussed:

var scenarioData = optionsParser.parse(req.body.someInput);


exports.parse = function(options) {

	var data = new Scenario();

	if (options.indexOf("-") < 0) {
		return data;
	}

	var args = parseArgs(options.split(' '));

	if (args['nomfa']) {
		data.isMfa = false;
	}

	return data;
};

Now that we have the options set for the scenario we want, we use node-cache to persist it across web requests: scenarioManager.js

var cache = new NodeCache({ stdTTL: 900, checkperiod: 300 });

exports.set = function(userSessionToken, scenarioData) {

	cache.set(userSessionToken, scenarioData);
	return scenarioData;
};

Now we can use the cache to access the scenario that’s currently being tested at any point we need to build a response: getMfaResponse.js:

module.exports = function(req, res) {

    scenarioManager.get(req.body.userSessionToken, function(scenarioData) {

        if (!scenarioData.mfaTypes) {
            // ... build the response for this case
        } else if (scenarioData.mfaTypes.length < 1) {
            // ... build the response for this case
        } else {
            // ... build the MFA challenge response
        }
    });
};

The downsides

As with anything, fake services are not a silver bullet. There are a few caveats to keep in mind:

  • Are you sure you understand the real service well enough to build a fake version of it?
  • Watch for edge cases where you may not be duplicating the behavior of the real service
  • If you do find edge cases, be sure to cover them with appropriate tests; manual testing with your fake service is not a replacement for good test coverage with unit/integration tests
  • Plan for regular maintenance of your fake to keep up with any changes in the interface or behavior of the API you depend on
  • Using a fake does not relieve you from the task of running your code against the genuine article

The last bullet point is important since there’s a large difference between “should work” and “actually works.” At some point in your workflow you’ll need to test the full production stack!


Here at Dwolla we’re committed to making developers’ lives easier by providing a great API for moving money with ACH transfers. We’ve found the concept of fake services to be invaluable in making this happen. If you found this useful, please share this article and comment with your own experiences. Happy building!

Check out the documentation

Your tips for better information security

In this post Ben Schmitt, Dwolla’s Information Security Risk Manager, explores tips for improving your personal information security. Read more from Ben on security here


The holiday season is here, which may increase the likelihood of online scams and phishing attempts. For better protection, certain InfoSec back-to-the-basics practices are worth repeating.

Be cautious with emails:

Don’t click on unsolicited and mysterious email links. This is a judgment call and you may want to consider the following:  

  • Check to see who sent the email—If you don’t recognize an email address, don’t click.
  • Hover over the linked content to discover where it will direct you—If you aren’t familiar with the destination address, don’t click.
  • Check the signature to see if it matches the sender’s standard or expected format—If it is out of place, don’t click.
  • Check the email header—If it doesn’t match the sender’s name, don’t click.

Be a helper and report suspicious spam or phishing emails. Most email providers offer a way to flag and/or report an email as spam. By reporting, you may help prevent others from falling victim.


Enable Two Factor Authentication (2FA) wherever possible. Dwolla provides 2FA to our customers and many other services you may use do as well (Twitter and Amazon for example).  Using more than a single ‘checkpoint’ for authentication may be one of the best things you can do to protect your information and prevent Account Take Overs (ATOs). In the event your credentials are stolen or guessed, an attacker only has something you know, not something you physically have.

Use a password management tool. The age-old problem of too many passwords remains a pain for everyone; however, all hope is not lost, as a password management tool can help with the generation, storage, and use of credentials. A password management tool can limit reuse by generating unique passwords for each platform and can keep easily obtained information out of your password/passphrase by generating them for you. Excellent solutions exist to manage these high-security, complex passwords—try LastPass, 1Password or KeePass.

Maintain your software:

Remember to patch and/or update your operating system and applications. This is a basic and expected maintenance activity just like changing the oil and refilling the windshield wiper fluid in your car. You can even automate these tasks to make life easier.

This includes complex software such as Adobe Acrobat Reader/Flash and Java. It is not enough to patch just the operating system. If you don’t need it, delete it.  For example, you have a choice in PDF readers and now that HTML5 is here, Flash may well be optional (FYI, YouTube works fine without Flash).  If you can’t live without Flash, enable click to play as a mitigation.

Ensure endpoint protection:

Endpoint protection (aka anti-virus) remains critical to your security, although it should not be your only layer of protection. Endpoint protection software must also be updated; failing to update endpoint protection software is marginally better than running with no endpoint protection at all.

Enable your device firewall:

Enabling your firewall is an easy step to protect your device.  By default, most device firewalls will deny inbound traffic but allow outbound traffic. This simply means your device is allowed to connect to other locations but doesn’t offer any services by default. A good example of why this is important: a coffee shop. Typical coffee shops use commodity wireless networks which are massively shared and rarely monitored. If your laptop is not running a firewall, you may expose vulnerable services such as Windows file sharing, Secure Shell (SSH), etc.  Refer to Apple OSX Firewall Guidance or Microsoft Firewall Help to learn more about how to enable a device firewall.

Secure your network:

Consider using a network security provider such as OpenDNS. This service is free for home/personal use and helps to provide safe browsing by blocking known phishing sites, improving speed, and even offering parental controls. This solution can be applied at your network router, which is really powerful. Why? Because all of your devices route through the router for policy and DNS. This means that every tablet, laptop, and Smart TV in your house benefits from DNS protection without your having to secure each individual device.

Backup, backup, backup:

Modern backups can be done in-home with a small storage device (such as an Airport Time Capsule or personal storage device) or via a cloud service.  In the event of a hardware failure or accidental deletion, your data should be available to restore via a backup.

Aside from the obvious advantage of being able to restore data, a backup can be a blessing in the event of ransomware.  Ransomware is a specialized form of malware which uses strong encryption to encrypt your valuable data (think of all Office documents being lost) and hold it ransom.  A backup solution allows for a data recovery without negotiating with attackers.


The above are a collection of leading InfoSec practices which may be applied broadly to help improve end user protection. It is important to stress that security is a process, and there is no list available which removes all risk. Similarly, security is a process without a defined end—we hope these tips help you on your journey.

At Dwolla, we are never done building, and that holds true for security as well. We are never done building a more secure way to move money. Learn more about security at Dwolla now

Thanks to CME Group, it’s a big day for real-time in the U.S.

This blog entry is a bit different, mainly because it includes a full press release at the end, which is hard to understand if you’re not operating in the futures market, or any other market for that matter.

Rather than re-post the press release, I’m going to explain why it’s important for Dwolla, but more so, why it’s special and meaningful to everyone working on real-time initiatives in this country.

CME Group is making markets more efficient

CME Group is the world’s leading and most diverse derivatives marketplace. The company will leverage Dwolla’s real-time architecture, FiSync, for on-demand payments. This means it will use a real-time framework to make things better for itself and its customers.

This means a gigantic driver in our economy is going to fire up real-time using Dwolla’s FiSync. That’s good news for real-time in this country and a big step forward for everyone who cares about upgrading our country’s payment systems.

It’s a big day for everyone who works on real-time systems because, as far as we know, it’s the first time a systemically important enterprise in the U.S. will begin building with a new real-time payment framework. It’s a stake in the ground, not just for Dwolla’s technology, but for the market—that change isn’t just coming, it’s upon us.

This isn’t going to change how millions of dollars in transactions are processed. It’s going to change how billions of dollars are processed. That’s exciting.

We’re thrilled to see the application of real-time technology in large organizations in ways which drive savings and revenues for all the parties involved.

Central Counterparties (CCP) are the center of the market

CCP is an easy acronym to remember, but kind of hard to imagine for people who don’t interact with markets…which is most of us. So, I’ll make an analogy that will hopefully make sense to people who aren’t in it and not upset people who are.

A CCP is like the operator of the master application behind a market. It is the organization that manages risk and moves money based on what’s happening in the market. A CCP, from a network perspective, is the center node on a very complex network. If you take it out, the markets don’t really work. If you speed up what it does, the markets get more efficient.

We’re working with CME Group to make the markets more efficient through the implementation of Dwolla’s network.

In a nutshell, a CCP manages software that moves assets (money) in the market. When those assets move faster, everyone wins.

Your market and product, our infrastructure

A few months ago Dwolla made some important changes in our company. We dropped transaction fees, introduced new pricing for businesses that need specialized features, and debuted our white label solution.

These changes were driven by our relationship with CME Group and our continued focus on building a better network. One that empowers other companies to leverage the efficiencies Dwolla’s platform provides without changing the way they do business.

We’re excited about the work that we’re doing with CME Group. Ultimately, it’s the same network the people reading this blog use every day.

Read the full press release here.

Ask Hard Questions, Build A Better Product

This blog post provides insights from our VP of Product, Brent Baker.

Over the past few years one trend has emerged that is critical to the FinTech world—in-app payments. Whether it’s paying for coffee at Starbucks with your phone, ordering dinner from Postmates, or stepping away from your ride-share, real-world transactions are now seamlessly integrated into apps.

My favorite part of the ride-sharing experience isn’t the ability to schedule the ride at the beginning of the experience—those who have ever tried to hail a cab in NYC on a Friday in the rain might disagree—but rather the simple act of getting out of the car with zero hassle at the end. This shift in consumer expectations led the Dwolla product team to question many of our assumptions about how Dwolla fits into the FinTech ecosystem—a healthy introspection most product teams should do on a regular basis.

Talk to Partners

We asked current and prospective partners detailed questions about what their expectations were for bank transfers, challenging them to think outside the current Dwolla product offering to outline their needs. We weren’t afraid to press them to tell us where they felt the product fell short—we were eager for honest feedback.


Over and over we heard the same issue: friction when another brand is introduced in the transaction flow. Our existing and potential partners made it clear that Dwolla’s value wasn’t in providing a network of existing users (e.g. PayPal) but rather that Dwolla allowed them to build an experience tailored to the needs of their customers. Partners wanted Dwolla to simplify the flow of funds between financial institutions and to provide their applications with a robust set of payment features. The value Dwolla could provide was this feature-rich platform.

Respond to Feedback

These discussions led to two significant changes in our product:


Execute, Execute, Execute

We knew what we had to do, so on to the execution! For anyone who knows me, this is my favorite part of the process. To begin, we had to make a tough decision: choose a build-fast path layering White Label functionality onto our existing API, or design a new API almost from scratch. Redesigning our API was 5x the effort at a time when we needed to move fast, but we made the right decision to invest in a vastly improved API that supports White Label, Direct, and traditional Full Dwolla accounts.


A key factor in our decision was the desire to re-imagine the developer experience. We took this opportunity to make existing endpoints more efficient and slim down the entire service by gutting features that were no longer driving value. The new API, V2, is currently live and we are in the process of migrating existing functionality and scopes from V1. We’ll reach out to existing API partners with a reasonable, thoughtful transition plan to help migrate over to the more robust API.

Round out the solution

To complement the new version of the Dwolla API, we are more deeply integrating the developer experience within the entire Dwolla website. More news on this front will be coming soon.


We Are Never Done

This directional shift is a continuous effort and we never stop learning and questioning. The Dwolla product team actively participates in all qualified sales calls and remains engaged once the contract is signed and partner development begins. We do this to understand both the integration challenges and the areas of the product that delight our partners. This level of participation ensures the voices of our customers are represented in the product we build with our partners in Development, Legal and Compliance.

When I look back on 2015, I am incredibly proud of what the product team and this organization have accomplished. A directional product shift of this significance requires open minds, a strong leadership team, and a willingness to question the product we have in market. We are never done.

How to make a GIF in Photoshop (for your perfect demo)


Whether you’re selling chocolate chip cookies or access to an API, we all know the importance of showing, rather than telling when promoting a product. For chocolate chip cookies, you offer a free sample. However, how do you show the power of an API or piece of technology?

My answer (as a marketer for a tech company who has been presented with this challenge): craft the perfect GIF.

Here’s why:

  • You can add your GIF to pretty much any presentation. Need to show a demo without leaving your slide deck? A GIF works perfectly.
  • You can embed a GIF into a blog post. Rather than scrolling through an endless stream of screenshots outlining each step of a product, insert a GIF.
  • GIFs have much smaller file sizes than videos, and you can alter the settings to make the file even smaller than what the automatic GIF-building websites produce.

What you need:

  • Adobe Photoshop
  • QuickTime (or something to record your screen)
  • A sweet product to show off (we used it for Dwolla FiSync)

The Instructions:

1. Prep

Before you dig into creating your demo GIF, spend time mapping out your screen recording. If this GIF is serving the purpose of demonstrating a function of your technology, you want it to go smoothly; the best way to ensure this is to outline a plan.

2. Record

To record the video that will be turned into a GIF I use QuickTime, which comes standard on Macs. Open QuickTime, go to the top left-hand corner, select ‘File’, then ‘New Screen Recording’.


When beginning your screen recording, hit the record button and drag your cursor to record the part of the screen you want to include in the demo.

3. Import into Photoshop

Start by opening Photoshop, and selecting ‘Window’. From there make sure ‘Timeline’ is selected. This will allow you to edit GIF settings in future steps.



Next, import your video by selecting ‘File’ > ‘Import’ > ‘Video Frames to Layers’.

4. Set your parameters for the GIF

After you have selected to import the video, you will be able to set parameters for the GIF. There are two primary things to tinker with here.

First is the start time. Adjust on the video timeline where you would like the GIF to start using the small, dark half pointer. Likewise, adjust where you would like the GIF to end using the other small pointer. Note: the larger pointer allows you to view your video at different points, but does not actually adjust the timing of your GIF.


You can also adjust the number of frames imported. In Photoshop you are limited to 500 frames to create your GIF, so if the video you’re importing is fairly long (more than a minute), you’ll want to adjust the frame count. When I’m uploading, I usually choose to limit to every 15 frames.

Adjusting the frame limit essentially means you’re taking out moments of the video. The more frames you keep, the smoother the GIF; the fewer you keep, the jumpier it looks (and the smaller the file).

You can play around with this, but you’ll have to start the import over each time you want to try a new frame count.
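To put rough numbers on the frame math (a sketch assuming a 30 fps recording; your frame rate may differ):

```javascript
// Rough arithmetic for Photoshop's 500-frame import limit, assuming a
// one-minute screen recording captured at 30 frames per second.
var totalFrames = 60 * 30;               // 1800 frames in the source video
var keepEvery = 15;                      // "limit to every 15 frames"
var importedFrames = totalFrames / keepEvery;
console.log(importedFrames);             // 120 frames, well under the 500 cap
```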

5. Perfect the timing of your GIF

Once you’ve imported your video, Photoshop will look something like this:

Now is where you fine-tune your GIF. The general lay of the land:

  • In the lower left corner of Photoshop are your play and pause buttons. These allow you to test out how your GIF will flow before exporting.
  • There is also a down arrow that controls your GIF loop. If you want the GIF to play continuously (like most GIFs), select ‘Forever’.

On the timeline across the bottom, you’ll notice an icon for each of your frames. You can delete frames and control the length of time you spend viewing each frame. For instance, if I wanted one frame to get more attention and remain on the screen longer, I would set it to have a longer time. You can alter this timing as much as you would like. If you want to breeze by a certain section, set the timing on all the frames in that section to a shorter length.

Note: If you want to quickly adjust all the frame times in your project, click the icon made up of an arrow and four small horizontal lines on the top right-hand side of your timeline. In this list of options, choose ‘Select all Frames’. Then you can adjust the time on all the frames with just a few clicks.

6. Saving your GIF for use

The final step of creating a GIF is saving it properly. Go to ‘File’ then select ‘Save for Web’.

From here, a box will pop up. Here you can:

  • View the size of the file
  • Alter the number of colors in the GIF to reduce the file size
  • Change the image size
  • Play around with dithering, etc.

Once you’ve gotten the GIF perfected, you just hit save and you’re good to go! Now you can use your GIF in slide decks, blog posts, or tweets to easily show just how powerful your product is.


Is your file size too big? If yes…
Try altering the number of frames in your project. The fewer the frames, the smaller the file. Also try reducing the number of colors or image size you set in the ‘Save for Web’ box.

Does your GIF play too fast? If yes…
Go back to your timeline, select all frames. Click ‘Other’ instead of the preset times provided. Here you can add a few milliseconds to your delay time.

Do you need to cut out a section? If yes…
Select the multiple frames you no longer wish to include then go to the menu button in the top right of the timeline. Here select ‘Delete Frames’ and that section will be removed.

Do you need to ADD a section from another video? If yes…
If you want to splice two videos together and make a GIF, import both videos as explained above. Then select the frames you want from one video and, in the menu on the right of your timeline, click ‘Copy Frame’. Next, move into your other project, click the frame you would like to insert next to, and click ‘Paste Frame…’ in the menu.





From hackathon project to cornerstone product: a look at Dwolla MassPay

It was 2012 when Dwolla first introduced MassPay. At the time, the product was described as a “lightweight tool that allows you to pay up to 2,000 people in just a few clicks.” Today, MassPay has become one of our most sought-after and powerful products, giving businesses the ability to upgrade their disbursement process and replace the paper check.

In this post, we’ll take a look at the evolution of MassPay and explore how a simple hack built during a Dwolla Hackathon forged a path to become a staple product and what we can learn from that process.

A Perceptive Hackathon Project

At Dwolla, we believe that we are all inventors and creators, but simply printing our value statements in the hottest typography and hanging them on the wall isn’t enough. To foster new ideas and cross-functional collaboration, we host internal hackathons and encourage team members to develop products or features that add value to the business. Once a hackathon wraps, team members can submit hackathon projects as candidates toward the upcoming planning cycle.

MassPay was built during a hackathon as a simple tool to pay back multiple vendors or contractors at once. Manually sending hundreds of individual transactions via Dwolla.com wasn’t logical; it simply induced carpal tunnel. A simple, efficient means of sending mass payments was top of mind in many sales discussions.


The first iteration of MassPay was built with accountants and bookkeepers in mind; CSV files were part of their daily regimen and required no previous technical knowledge. All one had to do was fill in an email address and the amount owed, upload the CSV file to the web app, and submit the job for processing.
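That upload format can be sketched in a few lines. This is an illustrative example only; the exact column names the original web app expected may have differed from the ones assumed here.

```python
import csv
import io

# Illustrative payout rows: one recipient per line, nothing more
# technical than what a bookkeeper already works with daily.
payouts = [
    {"email": "alice@example.com", "amount": "125.00"},
    {"email": "bob@example.com", "amount": "80.50"},
]

def build_masspay_csv(rows):
    """Serialize payout rows to a simple two-column CSV suitable
    for uploading to a batch-payments web app."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["email", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

print(build_masspay_csv(payouts))
```

The appeal of the format is exactly this simplicity: any spreadsheet tool can produce it, and no API knowledge is required.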

Killing the Check

As adoption of MassPay increased, it was evident that businesses were leveraging the tool to replace their outdated check operations. MassPay was a powerful solution for merchants cutting hundreds of checks each month, and the benefits were clear:

  • No more purchasing stamps or losing checks in the mail
  • Less information is required to send the payment—only email address and amount needed
  • Increased control of cash flow from the business to the recipient


Next up, MassPay in the Dwolla API

By making MassPay available in the Dwolla API, we made it easy for developers to integrate programmatic batch payouts into their applications and gave larger, more established platforms the freedom to implement a disbursement process that complemented their existing infrastructures.

A good example is Clear Capital, a nationwide provider of residential and commercial real estate services for mortgage lenders, servicers, investors, and others. Instead of sending only paper checks each month, Dwolla now processes a same-day wire transfer from Clear Capital and disburses the amounts owed to thousands of vendors via the Dwolla MassPay API, routing funds directly to each vendor’s respective bank or credit union account. With this new flow, Clear Capital is able to cut transfer times to just 1-2 business days.
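A programmatic batch payout boils down to building one request body that carries many payout items. The sketch below is hypothetical: the field names and structure are illustrative, not Dwolla’s documented API schema.

```python
import json

def build_batch_request(funding_source, items):
    """Assemble a hypothetical batch-payout request body: one funding
    source, many (recipient, amount) payout items."""
    return {
        "fundingSource": funding_source,
        "items": [
            {"destination": email,
             "amount": {"value": amount, "currency": "USD"}}
            for email, amount in items
        ],
    }

request_body = build_batch_request(
    "my-funding-source-id",  # placeholder identifier
    [("vendor1@example.com", "250.00"), ("vendor2@example.com", "99.95")],
)
print(json.dumps(request_body, indent=2))
```

The design point is that the caller submits one job and the payments platform fans it out to thousands of recipients, rather than the caller looping over thousands of single-payment API calls.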

Today: Lessons to learn

A few years ago, MassPay started out as a simple idea and a clever hackathon project. Today, many Dwolla merchants have replaced their outdated disbursement process and thousands have received payments via the MassPay API.

Do you need a custom payments solution for your business, platform, or organization?

Let Dwolla help you build it.

Building a data platform, embracing events as the atomic building blocks of data

Dwolla Data Visualization

Here at Dwolla, one of our core beliefs is that money = data. However, there is more to this than just dollars and cents—it takes careful investigation from a dedicated team working to answer questions and measure products. Over the last year we’ve been building a data platform to facilitate this measurement. In order to do this, we needed to effectively collect and analyze data.

Gathering atomic events

Database design has typically considered atomicity in relation to database transactions: operations that are all or nothing, indivisible and irreducible. For data to be semantically atomic, however, it needs to have meaning that is indivisible or irreducible. This goes beyond the current state of something; it considers the timing and how that state has transitioned or changed. We need semantic atomicity because:

  • Questions need to be answered, but the data needed to answer them is unknown.
  • The data needed to answer a question is known, but in its current form it cannot answer the question (easily).

Events are a natural fit to provide the flexibility and data we need. Events describe, through their identity and structure, what happened and when it happened; as time-based facts, they are a concept that others have also embraced. Since these events are atomic, the smallest pieces of data, we can compose them together to answer future questions.

Once we’ve defined these specific and immutable (unchanging) events, we simply need to gather them at scale and make them queryable. This allows us to quickly transform the data to answer questions as they arise.
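As a concrete illustration, here is a minimal sketch of semantically atomic events. The event type and field names are invented for this example, not taken from our actual schema; the point is that each record is immutable, captures what happened and when, and can be composed later to answer questions we did not anticipate.

```python
from dataclasses import dataclass
from decimal import Decimal

# frozen=True makes each event record immutable after creation.
@dataclass(frozen=True)
class TransferCompleted:
    transfer_id: str
    amount: Decimal
    occurred_at: str  # ISO-8601 timestamp

events = [
    TransferCompleted("t1", Decimal("25.00"), "2015-10-01T12:00:00Z"),
    TransferCompleted("t2", Decimal("10.50"), "2015-10-02T09:30:00Z"),
    TransferCompleted("t3", Decimal("4.25"), "2015-10-02T16:45:00Z"),
]

def volume_on(day, events):
    """Answer a question nobody planned for by folding over the
    atomic events: total transfer volume on a given day."""
    return sum(e.amount for e in events if e.occurred_at.startswith(day))

print(volume_on("2015-10-02", events))  # 14.75
```

Because the events themselves are never mutated, any such derived value can be thrown away and recomputed at will.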

Analyzing events at scale

We now have a proliferation of data, a metaphoric explosion of events in “big data”. However, since we have built our data platform on Amazon Web Services, we are able to leverage the following infrastructure:

  • Low cost, ubiquitous, and flexible storage of events in S3
  • Batch based aggregation and analysis with Elastic MapReduce
  • Real time aggregation and analysis with Redshift

This suite of tools allows us to then analyze data across the data structure spectrum.

In designing our data architecture we strive to adhere to these three principles:

  • Immutable events: immutability changes everything
  • Idempotent pipelines: applying any operation twice results in the same value as applying it once
  • Transformations as state machines: small composable steps with well-defined transitions from one state (data value) to another

Events are immutable, and data transformation is a one-way street. Because of this, we can archive, tear down, replace, and recalculate any value derived from our events. As long as we have the original atomic events, data at its source is never lost.
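The idempotency principle above can be sketched in a few lines. This is a toy illustration, not our production pipeline code: because each event carries a unique id, replaying the same batch of events leaves the derived state unchanged.

```python
def apply_events(state, events):
    """Fold events into derived state; events already seen are
    skipped, so applying a batch twice equals applying it once."""
    seen = dict(state.get("seen", {}))
    total = state.get("total", 0)
    for event in events:
        if event["id"] in seen:
            continue  # replaying an archived batch is harmless
        seen[event["id"]] = True
        total += event["amount"]
    return {"seen": seen, "total": total}

batch = [{"id": "e1", "amount": 5}, {"id": "e2", "amount": 7}]
once = apply_events({}, batch)
twice = apply_events(once, batch)  # replay the same batch
print(once["total"], twice["total"])  # 12 12
```

Idempotency is what makes the archive-and-replay workflow safe: a pipeline step that crashes halfway can simply be rerun from the start.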

We’re especially excited to release parts of our data stack as open source and will be presenting some of our experiences at upcoming events like Tableau’s TC15 conference: “Dwolla, Tableau, & AWS: Building Scalable Event-Driven Data Pipelines for Payments”.

We’ll be taking a deep dive into some of the concepts I have shared as well as the nuts and bolts behind our data architecture, in particular, how we’ve automated our pipelines with a soon to be released open source project, Arbalest. Finally, we’ll show how applications (like Tableau’s business intelligence platform) can leverage our data platform.


This blog post shares insights from Fredrick Galoso, a software developer and technical lead for the data and analytics team here at Dwolla. Fred has led the creation of our data platform, including its data pipeline, predictive analytics, and business intelligence platform. In his free time he contributes to a number of open source software projects, including Gauss, a statistics and data analytics library.



Introducing Two-Factor Authentication for Dwolla

Security is always top of mind at Dwolla, and it’s something we’ll never stop improving and iterating upon. While Dwolla has always required multiple elements for user sessions, such as email address, password and PIN, we’ve continued to work toward empowering our users with additional security measures.

Today, we’ve released the ability for full Dwolla account holders to enable two-factor authentication (2FA) on their accounts. By enabling 2FA, Dwolla members add an extra layer of security to protect their accounts.

How do I enable two-factor authentication?

Visit your account settings page within the Dwolla dashboard. You can navigate to this page by clicking on your avatar in the top right hand corner of your dashboard. From your account settings page, choose Security from the menu on the left.

Account Settings Page

You’ll notice the option to enable 2FA on your account security page. Choose to enable and re-enter your password.

Password on Security

When enabling 2FA, you will need to download and open an authenticator app, such as: Google Authenticator (iOS, Android), Duo Mobile (iOS, Android), Amazon Virtual MFA (Android), or Authenticator (Windows Phone).

Open your authenticator app of choice, and manually enter the key code or scan the QR code you’ll see on your Dwolla dashboard to generate a six-digit security code within the app.

iPhone 2FA Screenshot
Enter this six-digit code in step three to enable two-factor authentication on your Dwolla account.

Two Factor Enable

Next time you log in to your Dwolla account from any device, you will be prompted to supply a six-digit security code from your authenticator app after you enter your email and password. You can choose to supply this code every time you log in from that device or once every 30 days.

Security Code Screen

Why is two-factor authentication important?

Two-factor authentication helps protect your Dwolla account from the loss of credentials (e.g., your password being stolen). With 2FA enabled, a valid session requires something you know (your user ID and password) and something you have (a time-based one-time password from your 2FA app). In short, it helps prevent online identity theft, as a victim’s password alone is not enough for a fraudster to compromise an account.

Why use an authenticator app?

Dwolla chose Time-based One-Time Password (TOTP) as our method of two-factor authentication given customer feedback and the high level of security the TOTP protocol provides. TOTP is also extremely strong because the passcode itself is never transmitted, unlike SMS (text) codes, which, although unlikely, may be intercepted.
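For the curious, the TOTP codes your authenticator app generates come from a short, standardized algorithm (RFC 6238, built on RFC 4226 HOTP). The sketch below uses only the Python standard library; it illustrates the protocol and is not Dwolla’s server-side implementation.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Time-based one-time password per RFC 6238 (HMAC-SHA1).

    The shared secret is base32-encoded, which is how authenticator
    apps receive it from a QR code or manual key entry.
    """
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of 30-second steps since epoch.
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    # Dynamic truncation: the low nibble of the last byte picks an
    # offset, and 31 bits starting there become the code.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's published test secret, base32-encoded.
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, for_time=59))  # 287082 (RFC test vector)
```

Because both sides derive the code from the shared secret and the current time, nothing secret ever crosses the wire at login beyond the short-lived six-digit code itself.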

Have questions or feedback on this release? Please respond in the comments below, or on the Dwolla discussion board.

Real-time API status updates


As a software developer, I rely on many different services to get my work done: GitHub, Travis-CI, etc. Since my output depends on these services, I subscribe to their status pages, where I can track their virtual heartbeat and plan my work around any downtime they may experience. It only makes sense that we could, and more importantly, should offer this kind of service to our users and developers.

Say hello to Dwolla’s new status page!

As much as one can go to great lengths to avoid them, network and software issues that may impact the availability of a service are simply a reality for any service provider. As part of our commitment to providing a reliable and secure network, we think that timely notification of any incidents impacting our platform is essential.

What does it do? How does this benefit me?

By visiting the status page, you can check the status of all of the services that Dwolla offers, from our main website to our production and sandbox APIs. You can also go one step further and click “Subscribe to Updates” to receive e-mail or SMS notifications when the status of any Dwolla service changes.

In short, it will be the first place to go to find out if:

  • A service is down
  • A service is experiencing degraded performance
  • Any scheduled maintenance will occur
  • Unexpected errors are impacting functionality

When an incident is reported, you can track its status as we provide real-time updates during the course of its investigation, triaging, and eventual resolution.

We think that having real-time status updates is an important step in building a better real-time network. Let us know what you think.


©2018 TransSwipe

