A validated method is presented for computing the Lipschitz constant of recurrent neural networks. Lipschitz estimation of neural networks has gained prominence owing to its close links with robustness analysis, a central concern in modern machine learning, especially in safety-critical applications. In recent years, several methods for validated Lipschitz estimation of feed-forward networks have been proposed, yet relatively few are available for recurrent networks. In this article, a method based on interval enclosures of Clarke's generalized gradient is proposed for Lipschitz estimation of recurrent networks; it is applicable to both differentiable and non-differentiable networks. The method has a firm foundation in domain theory, and the algorithms can be proven correct by construction. A maximization algorithm based on bisection is devised, with which a certified estimate of the Lipschitz constant can be obtained and the region of least robustness can be located in the input domain. The method is implemented using interval arithmetic, and experiments on vanilla recurrent networks are reported.
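To illustrate the flavor of the approach, the following is a minimal sketch (not the paper's algorithm) of certified Lipschitz estimation by bisection: an interval enclosure of the gradient is evaluated over boxes of the input domain, the box with the largest bound is split, and pointwise gradient samples supply a matching lower bound. For simplicity the sketch treats a scalar map `f(x) = tanh(w*x + b)` rather than a recurrent network; all names and the toy target function are illustrative assumptions.

```python
import math

def dtanh_interval(lo, hi):
    """Enclosure of d/dz tanh(z) = 1 - tanh(z)^2 over [lo, hi].
    The derivative sech^2(z) peaks at z = 0 and decreases in |z|."""
    s = lambda z: 1.0 - math.tanh(z) ** 2
    upper = 1.0 if lo <= 0.0 <= hi else max(s(lo), s(hi))
    return min(s(lo), s(hi)), upper

def grad_bound(w, b, lo, hi):
    """Certified upper bound on |f'(x)| = |w| * sech^2(w*x + b) over [lo, hi]."""
    # Image of the affine map x -> w*x + b over the box [lo, hi].
    z1, z2 = w * lo + b, w * hi + b
    _, up = dtanh_interval(min(z1, z2), max(z1, z2))
    return abs(w) * up

def certified_lipschitz(w, b, lo, hi, tol=1e-6):
    """Branch-and-bound by bisection: repeatedly split the box with the largest
    gradient enclosure until the certified upper bound meets a sampled lower
    bound; returns the upper bound and the box of least robustness."""
    boxes = [(lo, hi)]
    best_lower = 0.0  # pointwise |f'| at box midpoints is a valid lower bound
    while True:
        bounds = [(grad_bound(w, b, a, c), (a, c)) for a, c in boxes]
        ub, (a, c) = max(bounds)
        m = 0.5 * (a + c)
        best_lower = max(best_lower,
                         abs(w) * (1.0 - math.tanh(w * m + b) ** 2))
        if ub - best_lower < tol or c - a < tol:
            return ub, (a, c)
        # Split the worst box; keep all other boxes unchanged.
        boxes = [bx for _, bx in bounds if bx != (a, c)] + [(a, m), (m, c)]

# The gradient peaks where w*x + b = 0, i.e. x = -0.25, giving L = |w| = 2.
L, box = certified_lipschitz(w=2.0, b=0.5, lo=-1.0, hi=1.0)
```

The returned box plays the role of the "region of least robustness": it is the sub-domain on which the gradient enclosure attains the certified maximum. A full implementation would replace the scalar derivative enclosure with an interval enclosure of Clarke's generalized gradient, which also covers non-differentiable activations such as ReLU.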