Getting Started - Developers

Welcome

This getting started guide is intended for developers. See below for instructions on how to install Spark and bring it into your existing build.

Requirements

Spark assumes your project uses Sass for styling and has a JavaScript build step that transpiles ES6+ and polyfills appropriately for the supported browsers.
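
Spark does not prescribe a particular toolchain. As one possible setup, a Babel configuration might look like the following sketch (the choice of Babel, core-js, and these options is an assumption, not a Spark requirement):

    // babel.config.js - a minimal sketch, assuming Babel and core-js are installed.
    // useBuiltIns: "usage" adds only the core-js polyfills the code actually needs.
    module.exports = {
      presets: [["@babel/preset-env", { useBuiltIns: "usage", corejs: 3 }]],
    };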

Spark Core

Vanilla

  • Install the npm package.

    npm install --save-dev @sparkdesignsystem/spark-core
  • Import the Sass setup file in your Sass build. This brings all Spark-Core Sass patterns into your build. The Spark classes are namespaced so that they don't affect any existing CSS.

    @import "node_modules/@sparkdesignsystem/spark-core/spark-core";
  • Spark has two JavaScript files that need to be incorporated into your build in order for Spark's behavior to work (a combined sketch follows this list).

    • spark-core-prerender.js - This file detects whether JavaScript is loaded and sets up the type loader. Its code needs to execute before the page is rendered, so it must be imported in the head of the document.

      import sparkCorePrerender from "@sparkdesignsystem/spark-core/spark-core-prerender";
    • spark-core.js - This file contains the bulk of Spark's behavior and can be loaded after the page is rendered; this is best done before the closing body tag.

      import sparkCore from "@sparkdesignsystem/spark-core/spark-core";

    There are also ES5 versions if preferred. They're located in @sparkdesignsystem/spark-core/es5.

  • Init the prerender JS, passing in an optional config object.

    sparkCorePrerender({ /* config, see below */ });

    See below for available configuration options:

    Key          Description
    typeConfig   Configures the type loader set up by spark-core-prerender.
  • Init the post-render Spark Core JS, passing in an optional config object.

    sparkCore({ /* config, see below */ });

    See below for available configuration options:

    Key               Description
    datePickerConfig  Exposes the configuration provided by tiny-date-picker; see its GitHub documentation.
  • Spark does not provide icons directly. The icons shown in this documentation are proprietary icons in use by Quicken Loans. To supply your own icon set, you need to import an SVG that contains symbols for the IDs referenced on the icon page. This symbol file should occur in the DOM before the first use element.

  • The main content area of your site will also need to have a data attribute set:

    data-sprk-main
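
Putting the pieces together, here is a minimal sketch of the two JS entry points. The file names and the two-bundle split are assumptions about your build; the imports and init calls are the ones shown above.

    // prerender-entry.js - bundle this and load it in the document head.
    import sparkCorePrerender from "@sparkdesignsystem/spark-core/spark-core-prerender";

    sparkCorePrerender({
      // Optional: typeConfig configures the type loader set up by prerender.
    });

    // main-entry.js - bundle this and load it before the closing body tag.
    import sparkCore from "@sparkdesignsystem/spark-core/spark-core";

    sparkCore({
      // Optional: datePickerConfig is handed through to tiny-date-picker,
      // e.g. { mode: "dp-below" }; see tiny-date-picker's docs for all keys.
    });

Remember that the main content area still needs the data-sprk-main attribute described above.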

Angular

Requirements

Spark's Angular implementation assumes a few things about your project.

  • We assume that your project is processing Sass rather than plain CSS for style information. More information for converting an Angular project to SCSS is available at angular.io.

  • We assume that your Angular project is using at least Angular 6 with TypeScript or is an Angular-CLI based project.

  • We assume that you have already installed @sparkdesignsystem/spark-core in your Angular project. Vanilla spark-core is a peer dependency of the Spark Angular packages.

Getting Started

To get started with spark-core-angular, you'll need to follow these steps:

  • Install the npm package.

    npm install --save-dev @sparkdesignsystem/spark-core-angular
  • Install the needed peer dependencies for our npm package. The Angular CLI should automatically install most of our peer dependencies; the packages listed below are the exception. If your application is missing one of our peer dependencies, your console will show a warning for that dependency when you install spark-core-angular.

    • npm install --save-dev tiny-date-picker
    • npm install --save-dev lodash
  • Import the Sass setup file in your Angular application's global styles Sass file, specified at the highest level of your app. This will bring all Spark-Core patterns into your build. The .scss file extension is needed in Angular's case so the import doesn't conflict with the JS file of the same name.

    @import "node_modules/@sparkdesignsystem/spark-core/spark-core.scss";
  • Your HTML element needs to have the following class:
    sprk-u-JavaScript
  • You'll need to import the spark-core-angular NgModule in your main app.module.ts file and add it to the NgModule imports array (see the sketch after this list).

    import { SparkCoreAngularModule } from "@sparkdesignsystem/spark-core-angular";
  • You'll need to import the BrowserAnimationsModule in your main app.module.ts file and add it to the NgModule imports array.

    import { BrowserAnimationsModule } from '@angular/platform-browser/animations';
  • Spark does not provide icons directly. The icons shown in this documentation are proprietary icons in use by Quicken Loans. To supply your own icon set, you need to import an SVG that contains symbols for the IDs referenced on the icon page. This symbol file should occur in the DOM before the first use element.
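
Put together, a minimal app.module.ts might look like the following sketch. AppComponent and the surrounding structure are standard Angular CLI defaults; only the two imports described above are Spark-specific.

    // app.module.ts
    import { NgModule } from '@angular/core';
    import { BrowserModule } from '@angular/platform-browser';
    import { BrowserAnimationsModule } from '@angular/platform-browser/animations';
    import { SparkCoreAngularModule } from '@sparkdesignsystem/spark-core-angular';

    import { AppComponent } from './app.component';

    @NgModule({
      declarations: [AppComponent],
      imports: [BrowserModule, BrowserAnimationsModule, SparkCoreAngularModule],
      bootstrap: [AppComponent],
    })
    export class AppModule {}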

Spark Extras

  • Install the npm package.

    npm install --save-dev @sparkdesignsystem/spark-extras
  • Import any of the Spark-Extras Sass patterns that you need.

    @import "node_modules/@sparkdesignsystem/spark-extras/components/<pattern-name>/<pattern-name>";
  • Import any of the Spark-Extras JS patterns that you need (a filled-in example follows this list).

    import <pattern-name> from "@sparkdesignsystem/spark-extras/components/<pattern-name>/<pattern-name>";
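
For example, with a hypothetical pattern named alerts (substitute a real pattern name from the Spark-Extras documentation), the two imports above would read:

    @import "node_modules/@sparkdesignsystem/spark-extras/components/alerts/alerts";

    import alerts from "@sparkdesignsystem/spark-extras/components/alerts/alerts";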

Angular

The Spark Angular Extras components rely on the Spark Core Angular npm package and the SparkCoreAngularModule import documented above. Make sure that setup is in place before using the Extras components. If baseUrl is not set in your TypeScript config file, you will need to prepend './' to the module import paths.

  • Install the npm package for the Spark Extra that you need.

    npm install --save-dev @sparkdesignsystem/spark-extras-angular-[pattern-name]
  • You'll need to import the NgModule provided (see the sketch below).

    import { Spark[PatternName]Module } from '@sparkdesignsystem/spark-extras-angular-[pattern-name]';
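
As a sketch, wiring up a hypothetical alert extra (the real module and package names come from each pattern's documentation) might look like this:

    // app.module.ts - SparkAlertModule and spark-extras-angular-alert are hypothetical names.
    import { NgModule } from '@angular/core';
    import { SparkAlertModule } from '@sparkdesignsystem/spark-extras-angular-alert';

    @NgModule({
      imports: [
        // ...SparkCoreAngularModule, BrowserAnimationsModule, etc. from the core setup,
        SparkAlertModule,
      ],
    })
    export class AppModule {}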

Browser Support

Spark supports the following browsers (current version and 1 prior):

  • Google Chrome
  • Google Chrome (Mobile)
  • Mozilla Firefox
  • Mozilla Firefox (Mobile)
  • Microsoft Edge
  • Apple Safari
  • Apple Safari (Mobile)

Spark also supports the following:

  • Microsoft Internet Explorer 11

Change Workflow and Contribution Guide

If you are interested in contributing to Spark, please read our full change workflow and contribution guide to get started.

Code Style Standards

Spark follows specific coding styles for HTML, CSS, and JavaScript to ensure maintainability and scalability. To successfully make a commit in this repo, the code must pass the pre-commit hooks that run automatically on commit: ESLint, Stylelint, and the code formatter Prettier.

HTML

  • Two spaces for indentation.
  • For better readability, when an element's attributes would make the line exceed 80 characters, place each attribute on its own line.

CSS

JavaScript

  • Two spaces for indentation.
  • Spark JS uses new features from ESNext and assumes applications using Spark have a JavaScript compiler set up.
  • Spark uses ESLint for JS linting.
  • ESLint is set up to use the Airbnb JavaScript Style Guide, which is where Spark's JS coding conventions come from.
  • We follow JS recommendations from the Quicken Loans JS Concord Group.
  • Data attributes on DOM elements are the chosen method for DOM selection.
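
A small sketch of what that convention means in practice (data-sprk-main comes from the setup steps above; data-sprk-toggle is a hypothetical attribute for illustration):

    // Elements are selected via data attributes rather than classes or ids,
    // so restyling can't accidentally break behavior.
    const main = document.querySelector("[data-sprk-main]");
    const toggles = document.querySelectorAll("[data-sprk-toggle]");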

Code Organization

Spark is managed as a monorepo. All of the Spark source code lives in a single repo, but is released as separate packages using Lerna.

This repo consists of the design system packages, wrapped in an instance of Drizzle, a pattern-library tool built by Cloud Four. Spark uses Drizzle for documentation and plain HTML code examples.

In the packages folder are Spark-Core and Spark-Extras. These are the files that are published to npm.

Running the Spark Docs site locally

  1. npm install
  2. gulp --dev
  3. Open your browser to http://localhost:3000/.