Stability of impulsive differential systems
The asymptotic phase property and the reduction principle for the stability of a trivial solution are generalized to noninvertible impulsive differential equations in Banach spaces whose linear parts split into two components satisfying a condition of separation.
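A hedged illustration, not taken from the paper: impulsive systems of the kind described are typically written with continuous dynamics between fixed impulse moments $\tau_i$ and jump conditions at those moments, the linear part being split into two components. The operators and nonlinearities below are generic placeholders, and the precise form of the separation condition is as specified in the paper.

\[
\begin{aligned}
\frac{dx}{dt} &= A(t)\,x + f(t,x,y), &\qquad \frac{dy}{dt} &= B(t)\,y + g(t,x,y), && t \neq \tau_i,\\
\Delta x\big|_{t=\tau_i} &= A_i\, x(\tau_i^-) + f_i\big(x(\tau_i^-),y(\tau_i^-)\big), &\qquad \Delta y\big|_{t=\tau_i} &= B_i\, y(\tau_i^-) + g_i\big(x(\tau_i^-),y(\tau_i^-)\big),
\end{aligned}
\]

where $x$ and $y$ take values in Banach spaces, the jump operators $A_i$, $B_i$ need not be invertible (so solutions cannot, in general, be continued backward through an impulse), and the condition of separation constrains how the evolution operators generated by the two linear components grow relative to one another.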