Creating an object stream from a JDBC ResultSet

The introduction of the Stream API and lambda expressions in Java 8 enables us to make an elegant conversion from a JDBC ResultSet to a stream of objects, just by providing a mapping function. Such a function could, of course, be a lambda.

Basically, the idea is to generate a Stream using a Supplier backed by a ResultSet:

public class ResultSetSupplier<T> implements Supplier<T> {

	private final ResultSet rs;
	private final Function<ResultSet, T> mappingFunction;

	public ResultSetSupplier(ResultSet rs,
			Function<ResultSet, T> mappingFunction) {
		this.rs = rs;
		this.mappingFunction = mappingFunction;
	}

	@Override
	public T get() {
		try {
			// advance the cursor and map the current row, or return null when no rows remain
			if (rs.next())
				return mappingFunction.apply(rs);
		} catch (SQLException e) {
			e.printStackTrace();
		}
		return null;
	}
}

The mappingFunction parameter, which may be a lambda expression, is used to build instances of T from the ResultSet. Just like in the Active Record pattern, every row in the ResultSet maps to an instance of T, where columns are attributes of T.

Let’s consider class City:

public class City{
		String city;
		String country;
		public City(String city, String country) {
			this.city = city;
			this.country = country;
		}
		public String getCountry() {
			return country;
		}
		@Override
		public String toString() {
			return "City [city=" + city + ", country=" + country + ";]";
		}
		@Override
		public int hashCode() {
			final int prime = 31;
			int result = 1;
			result = prime * result + ((city == null) ? 0 : city.hashCode());
			result = prime * result
					+ ((country == null) ? 0 : country.hashCode());
			return result;
		}
		@Override
		public boolean equals(Object obj) {
			if (this == obj)
				return true;
			if (obj == null)
				return false;
			if (getClass() != obj.getClass())
				return false;
			City other = (City) obj;
			if (city == null) {
				if (other.city != null)
					return false;
			} else if (!city.equals(other.city))
				return false;
			if (country == null) {
				if (other.country != null)
					return false;
			} else if (!country.equals(other.country))
				return false;
			return true;
		}
	}

The mapping function for City objects could be a lambda expression like the following:

(ResultSet rs) -> {
    try {
        return new City(rs.getString("city"), rs.getString("country"));
    } catch (Exception e) {
        return null;
    }
}

We have assumed database columns are called city and country, respectively.
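
At this point a stream can already be produced directly with Stream.generate(). The following is only a minimal sketch (the connection conn, the SQL and the table name are illustrative assumptions, not part of the original code):

Function<ResultSet, City> mapper = rs -> {
	try {
		return new City(rs.getString("city"), rs.getString("country"));
	} catch (Exception e) {
		return null;
	}
};

PreparedStatement st = conn.prepareStatement("SELECT city, country FROM cities"); // hypothetical query
Stream<City> cities = Stream.generate(new ResultSetSupplier<>(st.executeQuery(), mapper))
		.limit(100)                 // Stream.generate() is infinite, so bound it explicitly
		.filter(Objects::nonNull);  // the supplier yields null once the ResultSet is exhausted

// Note that closing 'cities' does not close the underlying ResultSet or PreparedStatement;
// that is exactly what the proxy-based approach described next takes care of.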

Although both PreparedStatement and ResultSet implement the AutoCloseable interface, since a ResultSet must be provided to create the object stream, it makes sense to close that ResultSet when the stream itself is closed.

A possible approach is to use a proxy to intercept method invocations on the object stream. When close() is invoked on the proxy, it also invokes close() on the provided ResultSet. Every other method invocation is delegated to the object stream, so all Stream features remain available. That is easy to achieve with a proxy.

Let’s have a look. We will have a proxy factory and an invocation handler:

public class ResultSetStreamInvocationHandler<T> implements InvocationHandler{

  private Stream<T> stream; // proxy will intercept method calls to such stream
  private PreparedStatement st;
  private ResultSet rs;

  public void setup(PreparedStatement st, Function<ResultSet, T> mappingFunction)
  throws SQLException{
    // PreparedStatement must be already setup in order
    // to just call executeQuery()
    this.st = st;
    rs = st.executeQuery();
    stream = Stream.generate(new ResultSetSupplier(rs, mappingFunction));
  }

  @Override
  public Object invoke(Object proxy, Method method, Object[] args)
  throws Throwable {

    if (method == null)
      throw new RuntimeException("null method null");

    // implement AutoCloseable for PreparedStatement
    // as calling close() more than once has no effects
    if (method.getName().equals("close") && args == null){
    // invoked close(), no arguments
      if (st != null){
        st.close(); // closes ResultSet too
      }
    }

    return method.invoke(stream, args);
  }

private class ResultSetSupplier implements Supplier<T>{

  private final ResultSet rs;
  private final Function<ResultSet, T> mappingFunction;

  private ResultSetSupplier(ResultSet rs, Function<ResultSet, T> mappingFunction) {
    this.rs = rs;
    this.mappingFunction = mappingFunction;
  }

  @Override
  public T get() {
    try {
      if (rs.next())
        return mappingFunction.apply(rs);
    } catch (SQLException e) {
     e.printStackTrace();
    }
    return null;
  }
}

}

Note how invoke() is used to intercept method calls: when close() is called on the stream, close() is called on the PreparedStatement as well. Every method call is then delegated to the stream being proxied.

And the factory:

 

public class ResultSetStream<T>{

	@SuppressWarnings("unchecked")
	public Stream<T> getStream(PreparedStatement st,
            Function<ResultSet, T> mappingFunction) throws SQLException{
		final ResultSetStreamInvocationHandler<T> handler =
                    new ResultSetStreamInvocationHandler<T>();
		handler.setup(st, mappingFunction);
		Stream<T> proxy = (Stream<T>) Proxy.newProxyInstance(getClass().getClassLoader(),
                new Class<?>[] {Stream.class},
                handler);
		return proxy;
	}
}
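
As a usage sketch (not part of the original code), the factory combines naturally with try-with-resources, so that closing the stream also closes the statement; the connection conn and the SQL below are assumptions:

PreparedStatement st = conn.prepareStatement("SELECT city, country FROM cities"); // hypothetical query

try (Stream<City> cities = new ResultSetStream<City>().getStream(st,
		rs -> {
			try {
				return new City(rs.getString("city"), rs.getString("country"));
			} catch (Exception e) {
				return null;
			}
		})) {
	cities.limit(10)                  // bound the infinite generated stream
	      .filter(Objects::nonNull)   // drop the trailing nulls once rows run out
	      .forEach(System.out::println);
} // close() on the proxied stream also closes the PreparedStatement (and its ResultSet)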

To put it all together, let’s write a simple test to show usage. Mockito will be used to mock both PreparedStatement and ResultSet to avoid running tests against a real database.


public class ResultSetStreamTest {

	private class City{
		String city;
		String country;
		public City(String city, String country) {
			this.city = city;
			this.country = country;
		}
		public String getCountry() {
			return country;
		}
		@Override
		public String toString() {
			return "City [city=" + city + ", country=" + country + "]";
		}
		@Override
		public int hashCode() {
			final int prime = 31;
			int result = 1;
			result = prime * result + getOuterType().hashCode();
			result = prime * result + ((city == null) ? 0 : city.hashCode());
			result = prime * result
					+ ((country == null) ? 0 : country.hashCode());
			return result;
		}
		@Override
		public boolean equals(Object obj) {
			if (this == obj)
				return true;
			if (obj == null)
				return false;
			if (getClass() != obj.getClass())
				return false;
			City other = (City) obj;
			if (!getOuterType().equals(other.getOuterType()))
				return false;
			if (city == null) {
				if (other.city != null)
					return false;
			} else if (!city.equals(other.city))
				return false;
			if (country == null) {
				if (other.country != null)
					return false;
			} else if (!country.equals(other.country))
				return false;
			return true;
		}
		private ResultSetStreamTest getOuterType() {
			return ResultSetStreamTest.this;
		}
	}

	private String[][] data = new String[][]{
			{"Karachi", "Pakistan"},
			{"Istanbul", "Turkey"},
			{"Hong Kong", "China"},
			{"Saint Petersburg", "Russia"},
			{"Sydney", "Australia"},
			{"Berlin", "Germany"},
			{"Madrid", "Spain"}
		};

	private int timesCalled;
	private PreparedStatement mockPST;
	private ResultSet mockRS;

	@Before
	public void setup() throws SQLException{
		timesCalled = -1;
		mockRS = mock(ResultSet.class);
		mockPST = mock(PreparedStatement.class);

		when(mockRS.next()).thenAnswer(new Answer<Boolean>() {

			@Override
			public Boolean answer(InvocationOnMock invocation) throws Throwable {
				if (++timesCalled >= data.length)
					return false;
				return true;
			}
		});

		when(mockRS.getString(eq("city"))).thenAnswer(new Answer<String>() {

			@Override
			public String answer(InvocationOnMock invocation) throws Throwable {
				return data[timesCalled][0];
			}
		});
		when(mockRS.getString(eq("country"))).thenAnswer(new Answer<String>() {

			@Override
			public String answer(InvocationOnMock invocation) throws Throwable {
				return data[timesCalled][1];
			}
		});

		when(mockPST.executeQuery()).thenReturn(mockRS);
	}

	@Test
	public void simpleTest() throws SQLException{

		try (Stream<City> testStream = new ResultSetStream<City>().getStream(mockPST,
				(ResultSet rs) -> {try {
					return new City(rs.getString("city"), rs.getString("country"));
				} catch (Exception e) {
					return null;
				}})){

			Iterator<City> cities = testStream.filter(
					city -> !city.getCountry().equalsIgnoreCase("China"))
					.limit(3).iterator();

			assertTrue(cities.hasNext());
			assertEquals(new City("Karachi", "Pakistan"), cities.next());

			assertTrue(cities.hasNext());
			assertEquals(new City("Istanbul", "Turkey"), cities.next());

			assertTrue(cities.hasNext());
			assertEquals(new City("Saint Petersburg", "Russia"), cities.next());

			assertFalse(cities.hasNext());
		}

	}

}

Download full source code on Github.

Java yield-like using Stream API

08/11/2014

Several programming languages, such as Ruby or Python to name a few, provide the yield keyword. Yield provides a memory-efficient way to create a series of values by generating them on demand. More information on Python Yield.

Let’s consider a class or method requiring a huge amount of secure random integers. The classical approach would be to create an array or collection of such integers. Yield provides two major advantages over that approach:

  • yield does not require knowing the length of the series in advance.
  • yield does not require storing all values in memory.

Fortunately, yield-like behavior can be achieved in Java 8 thanks to the Stream API:

 

 

import java.security.NoSuchAlgorithmException;
import java.security.SecureRandom;
import java.util.Date;
import java.util.function.Supplier;
import java.util.stream.Stream;

public class Yield {

	private static final Integer RANDOM_INTS = 10;

	public static void main(String[] args) {

		try (Stream<Integer> randomInt = generateRandomIntStream()){
			Object[] randomInts = randomInt.limit(RANDOM_INTS)
					.sorted().toArray();
			for (int i = 0; i < randomInts.length;i++)
				System.out.println(randomInts[i]);
		} catch (NoSuchAlgorithmException e) {
			e.printStackTrace();
		}
	}

	private static Stream<Integer> generateRandomIntStream()
			throws NoSuchAlgorithmException{
		return Stream.generate(new Supplier<Integer>() {

			final SecureRandom random = SecureRandom
                                .getInstance("SHA1PRNG");
			boolean init = false;
			int numGenerated = 0;

			@Override
			public Integer get() {
				if (!init){
					random.setSeed(new Date().getTime());
					init = true;
					System.out.println("Seeding");
				}
				final int nextInt = random.nextInt();
				System.out.println("Generated random "
                                         + numGenerated++
                                         + ": " + nextInt);
				return nextInt;
			}

		});
	}

}

The following is the output after the provided code snippet is executed:

Seeding
Generated random 0: -896358073
Generated random 1: -1268521873
Generated random 2: 9627917
Generated random 3: -2106415441
Generated random 4: 935583477
Generated random 5: -1132421439
Generated random 6: -1324474601
Generated random 7: -1768257192
Generated random 8: -566921081
Generated random 9: 425501046
-2106415441
-1768257192
-1324474601
-1268521873
-1132421439
-896358073
-566921081
9627917
425501046
935583477

It is easy to see that the Supplier is only instantiated once. Of course, we can take advantage of all Stream API features, such as limit() and sorted().

The line randomInt.limit(RANDOM_INTS).sorted().toArray() triggers the generation of RANDOM_INTS values which are then sorted and stored as an array.
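
To make the on-demand behaviour explicit, here is a small illustrative snippet (not from the original post): nothing is generated until a terminal operation pulls values, and only as many values as limit() allows are ever produced.

// Illustrative only: peek() shows when each value is actually generated.
Stream.iterate(0, n -> n + 1)                        // infinite, lazily generated series
		.peek(n -> System.out.println("generated " + n))
		.limit(3)                            // only three values are ever generated
		.forEach(System.out::println);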

Compile-time checking JPA queries

JPA provides several alternatives for querying data. Such alternatives may be classified according to a variety of criteria, e.g., the language used (SQL vs JPQL) or whether queries are static (defined at compile time) or dynamic (built at execution time).

Static queries are defined using the annotations @NamedQuery (javax.persistence.NamedQuery) and @NamedQueries (javax.persistence.NamedQueries) in the @Entity class definition itself:

 @NamedQuery(
            name="findAllCustomersWithName",
            query="SELECT c FROM Customer c WHERE c.name LIKE :custName"
    )

On the other hand, EntityManager provides the methods createQuery(…) and createNativeQuery(…), which take a JPQL or a SQL query, respectively.

Thus, queries can be defined either at compile time or at execution time.

(Note: it is advisable to always use parameterized queries, via the setParameter(…) methods of Query, to avoid SQL injection vulnerabilities.)
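
As a hedged sketch of what that looks like, assuming an EntityManager em and a Customer entity matching the named query above:

// Sketch: binds :custName instead of concatenating user input into the query string.
List<Customer> customers = em
		.createNamedQuery("findAllCustomersWithName", Customer.class)
		.setParameter("custName", "%Smith%")
		.getResultList();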

Criteria API

However, JPA provides an alternative approach to querying objects: the Criteria API. Indeed, one of the motivations to switch to JPA is to deal with objects rather than SQL dialects, isn’t it?

Let’s look at some sample code.

Entity definition:

@Entity
public class User {

 @Id
 private Integer userId;

 @Basic
 @Column(length=15, nullable=false)
 private String name;

 @Basic
 @Column(length=64, nullable=false)
 private String userDigestedPasswd;

 @Basic
 @Column(length=50, nullable=true)
 private String email;

 @Basic
 @Column(nullable=false)
 public Integer privilegeLevel;

 @Basic
 @Column(nullable=false)
 private Boolean active;
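 // getters and setters omitted for brevity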
}

Let’s query the DB and check the results (using JUnit):

public class UserTest {

	@Test
	public void testUserCriteria(){
		EntityManagerFactory emf = null;
		EntityManager em = null;
		try {
			emf = Persistence.createEntityManagerFactory("criteria");
			em = emf.createEntityManager();
			final CriteriaBuilder cb = em.getCriteriaBuilder();
			final CriteriaQuery<User> q = cb.createQuery(User.class);
			final Root<User> users = q.from(User.class);
			final Predicate condition = cb.equal(users.get("privilegeLevel"), 5);
			q.select(users).where(condition).orderBy(cb.asc(users.get("userId")));
			em.getTransaction().begin();
			List<User> result = em.createQuery(q).getResultList();
			em.getTransaction().commit();

			assertNotNull(result);
			assertEquals(2, result.size());

			assertEquals(1, (int) result.get(0).getUserId());
			assertEquals("Pepe", result.get(0).getName());

			assertEquals(3, (int) result.get(1).getUserId());
			assertEquals("Dolores", result.get(1).getName());
		} catch (Exception e) {
			fail("Unexpected Exception " + e.getMessage());
		} finally {
			if (em != null)
				em.close();
			if (emf != null)
				emf.close();
		}
	}
}

The following lines show the query creation:

final CriteriaBuilder cb = em.getCriteriaBuilder();
final CriteriaQuery<User> q = cb.createQuery(User.class);
final Root<User> users = q.from(User.class);
final Predicate condition = cb.equal(users.get("privilegeLevel"), 5);
q.select(users).where(condition).orderBy(cb.asc(users.get("userId")));

First of all, get a CriteriaBuilder from the EntityManager. Then, get a CriteriaQuery instance, setting the class that will hold the results; in our case, User.class:

final CriteriaBuilder cb = em.getCriteriaBuilder();
final CriteriaQuery<User> q = cb.createQuery(User.class);

Next, the Entity to run the query against must be set:

final Root<User> users = q.from(User.class);

Now it’s time to set the query’s matching conditions. In the sample code, the condition is just that the attribute privilegeLevel equals 5:

final Predicate condition = cb.equal(users.get("privilegeLevel"), 5);

Finally, the query is built by adding the conditions on the Root. Grouping and sorting options may be set too (here, ascending sorting is set on userId):

q.select(users).where(condition).orderBy(cb.asc(users.get("userId")));

Please have a look at CriteriaBuilder for the different options. Grouping and sorting options may be found in CriteriaQuery.
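
As an illustrative sketch of those grouping facilities (not part of the original example, and reusing the cb and em obtained above), a query counting users per privilege level could look like this:

// Sketch: groups User rows by privilegeLevel and counts each group.
CriteriaQuery<Object[]> countQuery = cb.createQuery(Object[].class);
Root<User> u = countQuery.from(User.class);
countQuery.multiselect(u.get("privilegeLevel"), cb.count(u))
		.groupBy(u.get("privilegeLevel"))
		.orderBy(cb.asc(u.get("privilegeLevel")));
List<Object[]> countsPerLevel = em.createQuery(countQuery).getResultList();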

Using metamodel for compile-time checking

Note that the query we have just built requires keeping track of object attribute names. E.g., to build the query, the name of the attribute privilegeLevel is passed as a String. However, if the attribute name were changed later, the code would still compile and only fail at runtime:

final CriteriaQuery<User> q = cb.createQuery(User.class);
final Root<User> users = q.from(User.class);
final Predicate condition = cb.equal(users.get("privilegeLevel"), 5);
q.select(users).where(condition).orderBy(cb.asc(users.get("userId")));

That is no good.

Fortunately, using metamodel, we will be able to build compile-time checked queries. A brief introduction can be found at The Java EE6 Tutorial.

Using the metamodel, the code references a SingularAttribute of the object rather than a String holding the attribute name. So, if the object attribute were changed later, the compiler would flag it for us.

First of all, the corresponding metamodel class must be generated. Although this can be achieved in several ways, probably the easiest one, for the OpenJPA implementation, is to add the OpenJPA build flag -Aopenjpa.metamodel=true.

So we have the class User_ generated, which is the corresponding metamodel class for User:

/** Generated by OpenJPA MetaModel Generator Tool. **/
package com.wordpress.tododev.criteria.entities;
import javax.persistence.metamodel.SingularAttribute;
@javax.persistence.metamodel.StaticMetamodel
(value=com.wordpress.tododev.criteria.entities.User.class)
@javax.annotation.Generated
(value="org.apache.openjpa.persistence.meta.AnnotationProcessor6",date="Mon Mar 04 16:47:46 CET 2013")
public class User_ {
 public static volatile SingularAttribute<User,Boolean> active;
 public static volatile SingularAttribute<User,String> email;
 public static volatile SingularAttribute<User,String> name;
 public static volatile SingularAttribute<User,Integer> privilegeLevel;
 public static volatile SingularAttribute<User,String> userDigestedPasswd;
 public static volatile SingularAttribute<User,Integer> userId;
}

If such a class were added to the code repository, later changes to the User class could go unnoticed, since the committed metamodel would become stale. Moreover, it is not a good idea to add auto-generated artifacts to a version control system anyway.

Using Ant, Maven or similar tools, a target could be added to generate the metamodel classes. Such a target should be executed after any change to the JPA entities; a sketch is shown below.
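
For example, with Ant this could be a plain javac task passing the annotation-processing flag; the paths, jar names and classpath reference below are assumptions to adapt to your project:

<!-- Sketch: regenerates the OpenJPA metamodel classes together with compilation. -->
<target name="generate-metamodel">
	<javac srcdir="src" destdir="build/classes" includeantruntime="false">
		<classpath refid="openjpa.classpath" />
		<compilerarg value="-Aopenjpa.metamodel=true" />
	</javac>
</target>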

It is also possible to use the IDE for that. E.g., Eclipse users just need to add the already mentioned compiler flag under Properties -> Java Compiler -> Annotation Processing, and add the lib (jar) containing the annotation processor for the chosen JPA implementation to the Factory Path section within Annotation Processing (this could lead to compilation issues when building automatically, since the metamodel class must be compiled before the code that uses it).

Let us add another test to the suite. This one will not provide a String containing the attribute name, but will use the metamodel class instead:

	@Test
	public void testUserCriteriaMetaModel(){
		EntityManagerFactory emf = null;
		EntityManager em = null;
		try {
			emf = Persistence.createEntityManagerFactory("criteria");
			em = emf.createEntityManager();
			final CriteriaBuilder cb = em.getCriteriaBuilder();
			final CriteriaQuery<User> q = cb.createQuery(User.class);
			final Metamodel m = em.getMetamodel();
			final Root<User> user = q.from(m.entity(User.class));
			final Predicate condition = cb.equal(user.get(User_.privilegeLevel), 5);
			q.select(user).where(condition).orderBy(cb.asc(user.get(User_.userId)));

			em.getTransaction().begin();
			List<User> result = em.createQuery(q).getResultList();
			em.getTransaction().commit();

			assertNotNull(result);
			assertEquals(2, result.size());

			assertEquals(1, (int) result.get(0).getUserId());
			assertEquals("Pepe", result.get(0).getName());

			assertEquals(3, (int) result.get(1).getUserId());
			assertEquals("Dolores", result.get(1).getName());
		} catch (Exception e) {
			fail("Unexpected Exception " + e.getMessage());
		} finally {
			if (em != null)
				em.close();
			if (emf != null)
				emf.close();
		}
	}

The most relevant changes are user.get(User_.privilegeLevel) instead of users.get("privilegeLevel") and user.get(User_.userId) instead of users.get("userId").

Download source code from GitHub

Playing with JerseyTest (Jersey 2.5.1 and DI)

I’m going to try to explain a trivial REST example. The idea is to build a basic skeleton to start playing with Jersey. When I begin using a framework, I usually develop a test environment to fail fast, and that is what I’m going to do here.

The next example has these features:

  • Jersey 2.5.1
  • Dependency Injection
  • JUnit for testing

Classes:

  • Resource: it will handle the HTTP calls.
  • Service: it’s an interface with two implementations, Impl1 and Impl2.
  • ServiceProvider: it will provide the appropriate implementation of Service for each request at runtime.
  • TestBinder: it sets up the bindings for the Resource.

 


import static org.junit.Assert.assertEquals;

import javax.inject.Inject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.core.Application;
import javax.ws.rs.core.Response;

import org.glassfish.hk2.api.Factory;
import org.glassfish.hk2.utilities.binding.AbstractBinder;
import org.glassfish.jersey.process.internal.RequestScoped;
import org.glassfish.jersey.server.ResourceConfig;
import org.glassfish.jersey.test.JerseyTest;
import org.junit.Test;

public class JerseyInjectionTest extends JerseyTest {

	private static final String EXPECTED_CONTENT = "any string :P";

	/**
	 * Checks that the Resource uses Impl1.class
	 */
	@Test
	public void invokeImpl1(){
		invoke(Impl1.class);
	}
	
	/**
	 * Checks that the Resource uses Impl2.class
	 */
	@Test
	public void invokeImpl2(){
		invoke(Impl2.class);
	}
	
	/**
	 * Checks that Resource.anyContent has always the value of EXPECTED_CONTENT
	 */
	@Test
	public void checkContent(){
		Response response = target("example/content").request().get();
		assertEquals(EXPECTED_CONTENT, response.readEntity(String.class));
	}
	
	private <T extends Service> void invoke(Class<T> service){
		final String serviceName = service.getName();
		Response response = target("example/"+serviceName).request().get();
		assertEquals(service.getName(), response.readEntity(String.class));
	}
	
	/**
	 * Register the Resource and TestBinder in the Application
	 */
	@Override
	protected Application configure() {
		return new ResourceConfig() {
			{
				register(new TestBinder());
				register(Resource.class);
			}
		};
	}

	@Path("/example")
	public static class Resource {

		@Inject
		Service service;
		@Inject
		String anyContent;

		/**
		 * Returns the name of the Service's implementation
		 */
		@GET
		@Path("/{serviceClass}")
		public Response getDynamicInvokedService() {
			return Response.ok(service.getClass().getName()).build();
		}

		/**
		 * Always returns the value of anyContent
		 */
		@GET
		@Path("/content")
		public Response getStaticContent() {
			return Response.ok(anyContent).build();
		}

	}
	
	/**
	 * This class will help Resource to set the @Inject fields.
	 */
	public static class TestBinder extends AbstractBinder{

		@Override
		protected void configure() {
			bindFactory(ServiceProvider.class).to(Service.class);
			bind(EXPECTED_CONTENT).to(String.class);
		}
		
	}

	/**
	 * This class will instantiate a Service implementation
	 * each time the Resource is called.
	 */
	@RequestScoped
	public static class ServiceProvider implements Factory<Service> {

		private final String serviceName;

		public ServiceProvider(@PathParam("serviceClass") String serviceName) {
			this.serviceName = serviceName;
		}

		@Override
		public void dispose(Service arg0) {}

		@Override
		public Service provide() {
			try {
				return (Service) Class.forName(serviceName).newInstance();
			} catch (Exception e) {
				return null;
			}
		}

	}

	/**
	 * Dummy services
	 */
	public static interface Service {}
	public static class Impl1 implements Service {}
	public static class Impl2 implements Service {}

}

Now we can try new features easily.

I hope that helps.

Detecting and Fixing XSS using OWASP tools

Much has been written about scanning for XSS vulnerabilities. In this article we will try to go a little further and show how to fix them.

To illustrate the whole process, from initial detection to providing a fix, we will use a very simple app consisting of two JSP pages: one is a payment form for credit card transactions and contains some XSS-exploitable code. The other one has such code fixed: the latter is just a patched version of the former. We will see how an attacker could trick users, exploiting the present XSS vulnerability, into handing over their credit card data.

(Download vulnerable app)

XSS Attacks

 

The goal of XSS attacks is to have an injected script executed by the user's web browser. In most cases, the user is not even aware of what is going on. For further information about XSS, please have a look at the OWASP XSS Attack Description.

Let us have a look at our sample app. The vulnerable JSP (xss_html.jsp) contains the following code fragment:


<%
final String amount = request.getParameter("amount");
Enumeration<String> pNames = request.getParameterNames();
while (pNames.hasMoreElements()){
   final String pName = pNames.nextElement();
   final String pVal = request.getParameter(pName);
%>
    <input type="hidden" name="<%=pName%>" value="<%=pVal%>" />
<%
}
%>
<table>
<tbody>
<tr>
<td>Credit card</td>
<td><input type="text" maxlength="16" name="cc" size="16" value="" /></td>
</tr>
<tr>
<td>Exp Date (mm/yy)</td>
<td><input type="text" maxlength="2" name="expMonth" size="2" value="" />
/<input type="text" maxlength="2" name="expYear" size="2" value="" /></td>
</tr>
<tr>
<td>CVV2</td>
<td><input type="text" maxlength="2" name="expMonth" size="2" value="" />
/<input type="text" maxlength="2" name="expYear" size="2" value="" /></td>
</tr>
<tr>
<td colspan="2"><input id="button1" type="submit" name="button1" value="Pay" /></td>
</tr>
</tbody>
</table>

The form receives the amount to charge as an HTTP request parameter, collects credit card data from the user and then charges her (of course, that last step is not included; you can try with any non-real credit card data). Users could be redirected here from any e-commerce site using a URL such as http://……/XSS_Vulnerable/xss_html.jsp?amount=12.25 (again, nobody would choose such a path for a payment gateway pretending to be a trusted one). But let us see what happens if an attacker tricks someone into loading a malicious URL like the one included in index.jsp instead, when the user presses the Pay button (Firefox 24.0 for Linux):

[Screenshot: Firefox executing the injected script (firefox_vulnerable)]

Just an alert, but the injected JavaScript code could have created an image or some other kind of link, so the attacker would have been able to collect the data just by looking at the Apache access logs. Indeed, some browsers are able to identify such threats and will not execute the injected scripts, as shown below (Chromium 28.0 for Linux).

[Screenshot: Chromium refusing to execute the injected script (chrome_refuses_exec_script)]

XSS Detection

 

Fortunately, there are a lot of tools that scan for XSS threats, so, for the most common issues, there is no need to look at every line of code in every web page when trying to locate such vulnerabilities. One of those tools is the OWASP Zed Attack Proxy (ZAP). It would not be fair to say it is just an XSS scanner, though, as it provides many more interesting features.

ZAP can be used as a proxy (indeed, it is based on the older Paros Proxy), which lets it scan all pages accessed during the session. However, we are just going to introduce the URL (http://……/XSS_Vulnerable/xss_html.jsp?amount=12.25) and press the Attack button (we are using ZAP 2.2.2). To avoid several warnings and to make scanning faster, we disabled all scan types except XSS.

[Screenshot: ZAP reporting the XSS vulnerability (ZAP_XSS_vulnerable)]

Starting from the provided URL (http://……/XSS_Vulnerable/xss_html.jsp?amount=12.25), ZAP has made several checks, injecting JavaScript code into the parameters.

The tab at the bottom shows one successful XSS attack attempt. ZAP has replaced the numeric value of the amount parameter with URL-encoded JavaScript code (as seen in the URL field), which is just "><script>alert(1);</script> in plain text (as seen in the Evidence field).

Moreover, in the tab above, where the HTTP response is shown, the result of the XSS attack is clear: the injected code in the amount parameter first closes the double quotes (") around the value of the amount field and closes the HTML input (>). Afterwards, it adds the script alert(1); (<script>alert(1);</script>). The resulting HTML code to be executed by the web browser contains:

<input type="hidden" name="amount" value=""><script>alert(1);</script>" />

XSS Fix

 
Although there is no single fix for all XSS attacks, all of them are based on input validation, where "input" could be anything from HTTP request parameters to HTTP header values or even names… it all depends on what the code uses as input.

In our sample app, an HTTP request parameter is being used to write HTML code.

OWASP provides the OWASP Enterprise Security API (ESAPI) in several languages, including, of course, Java. ESAPI includes much more security-related functionality, from XSS and CSRF to crypto.

To fix our XSS vulnerability, we just use an ESAPI encoder (ESAPI 2.1.0). The fix consists of writing the received amount parameter HTML-encoded instead of as received. This way, the user's web browser will not execute the JavaScript code, as it will simply be treated as the value of the amount parameter.

The fix requires just HTML encoding the amount parameter (see xss_html_esapi.jsp) as follows:

<form method="POST" name="sendForm" id="sendForm" onsubmit="return sendPaymentRequest()">

<%
final String amount = request.getParameter("amount");
Enumeration<String> pNames = request.getParameterNames();
while (pNames.hasMoreElements()){
    final String pName = pNames.nextElement();
    final String pVal = request.getParameter(pName);

    final org.owasp.esapi.Encoder esapiEnc = DefaultEncoder.getInstance();
    final String encPVal = esapiEnc.encodeForHTML(pVal);

    %>
    <input type="hidden" name="<%=pName%>" value="<%=encPVal%>" />
    <%
}
%>

<table>

Running ZAP against the fixed JSP (xss_html_esapi.jsp) does not report any XSS vulnerabilities.
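
For reference, with the encoder in place the quote and angle-bracket characters of the injected payload end up written as HTML entities rather than markup, so the browser treats them as plain text; the resulting HTML looks roughly like this (illustrative output, the exact entities produced by ESAPI may vary):

<input type="hidden" name="amount" value="&quot;&gt;&lt;script&gt;alert(1);&lt;/script&gt;" />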

Sustainable peace with database changes into a Java environment

12/08/2013

Sustainable peace, for us, means removing uncertainty. In this case, regarding database changes, the idea of Ruby's Active Record Migrations was welcomed.

And what does migration mean for us? Well, it is a convenient way to alter our database schema over time in a consistent and easy way, one that removes a lot of uncertainty about database changes from our software development process.

Goal

Our goal is to maintain the lifecycle of the database in line with the development and evolution of the project, with absolute control over the changes.

For this, we have to look for a simple tool with a basic set of characteristics such as the following:

  • Works with any database, although our current database is MySQL.
  • Enables concurrent developers to work independently.
  • Enables different development environments.
  • Integrates with any version control system.
  • Integrates migration tasks easily into Apache Ant.
  • Allows forward and backward migrations, with conflicts that are easy to manage.

We selected the MyBatis Migrations tool as the best solution for us, and a GitHub repository with an Ant script to run MyBatis Migrations' commands as a starting point.

Let’s get to the point: how we work with migrations

With these tools, we think that a migration lifecycle may look like this one.

The first time
  • Create a migrations directory into our project directory.
  • Download MyBatis Schema migrations file mybatis-migrations-3.1.1-bundle.zip.
  • Create a lib directory and copy mybatis-3.2.3.jar and mybatis-migrations-3.1.1.jar files.
  • Download the Ant tasks build.properties and build.xml files from mybatis-migrations-anttasks-master.zip and rename them to migrations.properties/xml for clarity.
  • Obviously, these files define the Ant tasks and basic properties for the migrations tool. migrations.properties (comments are included for clarity) defines:
    
    # Default environment
    mybatis.default.environment=development
    
    mybatis.dir=migrations
    mybatis.lib.dir=${mybatis.dir}/lib
    
    mybatis.repository.dir=${mybatis.dir}/db
    
    # This directory contains your migration SQL files. These are the files 
    # that contain your DDL to both upgrade and downgrade your database 
    # structure. By default, the directory will contain the script to 
    # create the changelog table, plus one empty example migration script. 
    mybatis.scripts.dir=${mybatis.repository.dir}/scripts
    
    # Place your JDBC driver .jar or .zip files in this directory.
    # Upon running a migration, the drivers will be dynamically loaded.
    mybatis.drivers.dir=${mybatis.repository.dir}/drivers
    
    # In the environments folder you will find .properties files that 
    # represent your database instances. By default a development.properties 
    # file is created for you to configure your development time database 
    # properties.
    # You can also create test.properties and production.properties 
    # files. The properties file is self documented.
    mybatis.env.dir=${mybatis.repository.dir}/environments
    
    

    and migrations.xml defines the Ant tasks, as you can see in the original documentation. Of course, after renaming, you must update the properties file path it loads:

    <?xml version="1.0" encoding="UTF-8"?>
    <project name="MyBatis Migrations" basedir="." 
             default="db:migrate:status">
    
    	<property file="migrations/migrations.properties" />
    
    .....
    </project>
    
  • But how do we install it? It's easy; basically, we have to execute:
    $ ant -f migrations.xml db:migrate:init
    

    It creates the directories and the initial files as defined in migrations.properties, as you can see in this output log:

    Buildfile: /wpr/myproject/migrations/migrations.xml
    
    db:migrate:init:
         [echo] ** Executing "migrate init" on "development" environment **
         [java] ------------------------------------------------------------
         [java] -- MyBatis Migrations - init
         [java] ------------------------------------------------------------
         [java] Initializing: db
         [java] Creating: environments
         [java] Creating: scripts
         [java] Creating: drivers
         [java] Creating: README
         [java] Creating: development.properties
         [java] Creating: bootstrap.sql
         [java] Creating: 20131123174059_create_changelog.sql
         [java] Creating: 20131123174100_first_migration.sql
         [java] Done!
         [java] 
         [java] ------------------------------------------------------------
         [java] -- MyBatis Migrations SUCCESS
         [java] -- Total time: 2s
         [java] -- Finished at: Sat Nov 23 18:41:00 CET 2013
         [java] -- Final Memory: 1M/117M
         [java] ------------------------------------------------------------.
    
    BUILD SUCCESSFUL
    Total time: 3 seconds
    

    where:

    • environments, scripts and drivers are directories (as seen before).
    • README explains the directories' contents, as the name suggests.
    • bootstrap.sql is where you have to include the actual database schema. You need to start from a known state.
    • 20131123174059_create_changelog.sql contains the default control table for the migration tool. It's a price you have to pay.
    • 20131123174100_first_migration.sql will be your first SQL migration file. You can delete it or rename it for clarity, although you must keep the yyyymmddHHMMss_ prefix format.
  • Keep the database properties for the development environment in migrations/db/environments/development.properties:
    ## JDBC connection properties. 
    driver=com.mysql.jdbc.Driver
    url=jdbc:mysql://localhost:3306/<databaseName>
    username=root
    password=root
    
  • Add other environment properties files as migrations/db/environments/<environment>.properties if you need them.
  • As a last step, put your actual database schema into the bootstrap.sql file.
Day by day

Among all the migrate commands, these are the ones we normally use day to day; a sketch is shown below.
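
A hedged sketch of that day-by-day sequence, assuming the Ant wrapper also exposes the underlying new, up and status commands with the same db:migrate:* naming convention used above:

$ ant -f migrations.xml db:migrate:new     # create a new timestamped migration script in db/scripts
$ ant -f migrations.xml db:migrate:up      # apply pending migrations to the default environment
$ ant -f migrations.xml db:migrate:status  # check which scripts have been applied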

Optional steps included:

  • Revert migrations if necessary to solve conflicts. Any mistake has an easy solution with db:migrate:down… but remember that it works one step at a time.
  • Apply pending migrations out of order, if it's safe to do so, with db:migrate:pending or db:migrate:version. Actually, if you want to execute those tasks you will have to add the code below to migrations.xml:
    <?xml version="1.0" encoding="UTF-8"?>
    <project name="MyBatis Migrations" basedir="." default="db:migrate:status">
    ....
    
    	<!-- $ migrate pending -->
    	<target name="db:migrate:pending" description="Runs all pending migrations regardless of their order or position in the status log">
    		<migrate command="pending" environment="${environment}" />
    	</target>
    
    	<!-- $ migrate version -->
    	<target name="db:migrate:version" description="Migrate the schema to any specific version">
    		<input addproperty="specific.version" message="Specific version to migrate:" />
    		<migrate command="version" environment="${environment}">
    			<extraarguments>
    				<arg value="${specific.version}" />
    			</extraarguments>
    		</migrate>
    	</target>
    
    </project>
    
  • Generate migration scripts to be run “offline” in environments that are beyond your control.
  • Get the status of the system at any time by running db:migrate:status.

We hope you find our solution useful. All comments are welcome, and apologies for my English.

Proxying an object

I was browsing through the reflection part of Guava and discovered how easy it is to create a proxy for an object. The drawback is that you can only create proxies for an interface, but there is a way to work around that.

For example, Date is not an interface, and I want to produce dates with the milliseconds set to zero. To do that, I create a Proxy interface that I will back with my new code.


import static org.junit.Assert.assertEquals;

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.util.Date;

import org.junit.Test;

import com.google.common.reflect.Reflection;

public class ProxyTest {

	@Test
	public void dateZeroMillis() {
		Date withMillis = new Date();
		@SuppressWarnings("unchecked")
		Proxy<Date> proxyObject = Reflection.newProxy(Proxy.class,
				new RemoveMillis(withMillis));
		Date date = proxyObject.get();
		System.out.println("Date was: " + withMillis.getTime()
				+ " but now is: " + date.getTime());
		assertEquals(0, date.getTime() % 1000);
	}

	public static class RemoveMillis implements InvocationHandler {

		private Date date;

		public RemoveMillis(Date date) {
			this.date = date;
		}

		@Override
		public Object invoke(Object proxy, Method method, Object[] args)
				throws Throwable {
			return new Date((date.getTime() / 1000) * 1000);
		}

	}

	public static interface Proxy<T> {
		T get();
	}
}

The output of the test is this:
Date was: 1383914151227 but now is: 1383914151000
