i am not sure how i should respond to this but i will elaborate more on the rational root theorem that has been mentioned.
for a given polynomial (a_n)x^n + (a_(n-1))x^(n-1) + ... + (a_0) of degree n, where all the coefficients are integers, in order to find candidates for the rational roots of the polynomial, you should first consider the factorizations of the leading coefficient (a_n) and the constant term (a_0).
for instance, given p(x)=4x^3-4x^2-x+1, all the possible factors of 4 (the leading coefficient) are A={1, -1, 2, -2, 4, -4} and those of 1 (the constant term) are B={1, -1}
next divide every element in B by every element in A to form the set of possible rational roots of p(x)=0, i.e. {1, -1, 1/2, -1/2, 1/4, -1/4}
you can check which of the six numbers are roots of p(x)=0. in this case, one can easily verify that 1, 1/2, and -1/2 are indeed the three roots of p(x)=0.
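the candidate search above is easy to automate. here is a minimal python sketch (the helper names `divisors`, `rational_root_candidates`, and `eval_poly` are my own, not from any library) that lists the candidates and tests each one exactly using fractions:

```python
from fractions import Fraction

def divisors(n):
    """all positive divisors of |n|."""
    n = abs(n)
    return [d for d in range(1, n + 1) if n % d == 0]

def rational_root_candidates(coeffs):
    """coeffs = [a_n, ..., a_0] with integer entries.
    by the rational root theorem, any rational root p/q (in lowest terms)
    has p dividing a_0 and q dividing a_n."""
    a_n, a_0 = coeffs[0], coeffs[-1]
    cands = set()
    for p in divisors(a_0):
        for q in divisors(a_n):
            cands.add(Fraction(p, q))
            cands.add(Fraction(-p, q))
    return sorted(cands)

def eval_poly(coeffs, x):
    """evaluate the polynomial at x via horner's method."""
    acc = 0
    for c in coeffs:
        acc = acc * x + c
    return acc

coeffs = [4, -4, -1, 1]  # p(x) = 4x^3 - 4x^2 - x + 1
roots = [c for c in rational_root_candidates(coeffs) if eval_poly(coeffs, c) == 0]
print(roots)  # [Fraction(-1, 2), Fraction(1, 2), Fraction(1, 1)]
```

using `Fraction` instead of floats means the check `== 0` is exact, so a candidate is never rejected because of rounding error.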
note that the list of candidates obtained by division only covers the rational roots (i.e. in some cases, some of the roots may be irrational and you have to factorize the polynomial to get all the roots)
for instance, given p(x)=x^3-x^2-2x+2
all the possible factors of 1 are A={1, -1}, and that of 2 are B={1, -1, 2, -2}
by the same procedure illustrated above, the set of possible rational roots of p(x)=0 is {1, -1, 2, -2}
one can check that x=1 is a solution to p(x)=0. by the factor theorem, (x-1) is a factor of p(x). using long division,
p(x)=(x-1)(x^2-2)
and thus the roots of p(x)=0 are x=1, sqrt(2), -sqrt(2).
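the long division step can be sketched in python as well. this is a small synthetic-division helper (a hypothetical name, not a library function) for dividing by a linear factor (x - r):

```python
def synthetic_division(coeffs, r):
    """divide the polynomial with coefficients [a_n, ..., a_0] by (x - r).
    returns (quotient coefficients, remainder); remainder 0 means
    (x - r) is a factor, as the factor theorem says."""
    acc = [coeffs[0]]
    for c in coeffs[1:]:
        acc.append(acc[-1] * r + c)
    return acc[:-1], acc[-1]

# p(x) = x^3 - x^2 - 2x + 2, dividing out the known root x = 1
q, rem = synthetic_division([1, -1, -2, 2], 1)
print(q, rem)  # [1, 0, -2] 0  -> quotient x^2 - 2, remainder 0
```

the quotient [1, 0, -2] is x^2 - 2, matching p(x) = (x-1)(x^2-2) above, and its roots ±sqrt(2) are the irrational roots the candidate list could not catch.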
when using the theorem, one should keep in mind that any polynomial of degree n has exactly n roots, real and complex, counted with multiplicity.
matrix addition and subtraction are very easy. as long as the two matrices have the same dimensions, all you have to do is add or subtract the entries at the same position in each matrix.
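as a quick sketch (plain python lists of rows, no library; `mat_add` is my own name for illustration):

```python
def mat_add(A, B):
    """entrywise sum of two matrices stored as lists of rows;
    A and B must have the same dimensions."""
    assert len(A) == len(B) and len(A[0]) == len(B[0])
    return [[a + b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(A, B)]

print(mat_add([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[6, 8], [10, 12]]
```

subtraction works the same way with `a - b` in place of `a + b`.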
for matrix multiplication, if you wish to compute AB, where AB makes sense (the number of columns of A equals the number of rows of B), perhaps it is helpful to think of the multiplication procedure as multiplying the column vectors of B by A, i.e. if
B = [b_1 b_2 ... b_n], then AB = [A(b_1) A(b_2) ... A(b_n)]
maybe there is a more convenient way to help you remember how to do the multiplication.
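the column-by-column view AB = [A(b_1) A(b_2) ... A(b_n)] can be written out directly. a small sketch with lists of rows (`mat_vec` and `mat_mul` are hypothetical helper names):

```python
def mat_vec(A, v):
    """A times a column vector v: each entry is a row of A dotted with v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def mat_mul(A, B):
    """compute AB one column at a time: the j-th column of AB is A(b_j),
    where b_j is the j-th column of B."""
    cols_B = list(zip(*B))                    # columns b_1, ..., b_n of B
    cols_AB = [mat_vec(A, b) for b in cols_B] # A(b_1), ..., A(b_n)
    return [list(row) for row in zip(*cols_AB)]  # reassemble as rows

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_mul(A, B))  # [[19, 22], [43, 50]]
```

the transpose trick `zip(*B)` just pulls out the columns of B so each one can be fed to `mat_vec` separately, which mirrors the picture in the text.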