      INTEGER IA(5), JA(8), IPARM(12), IWKSP(12)
      REAL A(8), RHS(4), U(4), WKSP(24), RPARM(12)
      DATA A(1),A(2),A(3),A(4) / 4.0,-1.0,-1.0,4.0 /
      DATA A(5),A(6),A(7),A(8) / -1.0,4.0,-1.0,4.0 /
      DATA JA(1),JA(2),JA(3),JA(4) / 1,2,3,2 /
      DATA JA(5),JA(6),JA(7),JA(8) / 4,3,4,4 /
      DATA IA(1),IA(2),IA(3),IA(4),IA(5) / 1,4,6,8,9 /
      DATA RHS(1),RHS(2),RHS(3),RHS(4) / 6.0,0.0,0.0,6.0 /
      DATA N /4/, NW /24/, ITMAX /4/, LEVEL /1/, IDGTS /2/
C
      CALL DFAULT (IPARM, RPARM)
      IPARM(1) = ITMAX
      IPARM(2) = LEVEL
      IPARM(12) = IDGTS
      CALL VFILL (N, U, 0.E0)
      CALL JCG (N,IA,JA,A,RHS,U,IWKSP,NW,WKSP,IPARM,RPARM,IER)
      STOP
      END
The output for this run would be
BEGINNING OF ITPACK SOLUTION MODULE JCG
JCG HAS CONVERGED IN 2 ITERATIONS.
APPROX. NO. OF DIGITS (EST. REL. ERROR) = 14.6 (DIGIT1)
APPROX. NO. OF DIGITS (EST. REL. RESIDUAL) = 14.3 (DIGIT2)
SOLUTION VECTOR.
1 2 3 4
-------------------------------------------------------------
2.00000E+00 1.00000E+00 1.00000E+00 2.00000E+00
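The sparse storage in the sample program is ITPACK's symmetric format: IA holds row pointers, JA column indices, and only the diagonal and upper-triangular entries of each row are stored. A short sketch (Python with NumPy, not part of ITPACK) that expands this storage to a dense matrix and checks the solution printed above:

```python
import numpy as np

# ITPACK-style symmetric sparse storage of the 4x4 example system:
# only the diagonal and upper-triangular entries of each row are stored.
A_vals = [4.0, -1.0, -1.0, 4.0, -1.0, 4.0, -1.0, 4.0]
JA = [1, 2, 3, 2, 4, 3, 4, 4]    # column indices (1-based)
IA = [1, 4, 6, 8, 9]             # row pointers (1-based)
RHS = [6.0, 0.0, 0.0, 6.0]
N = 4

# Expand to a dense symmetric matrix.
A = np.zeros((N, N))
for i in range(N):
    for k in range(IA[i] - 1, IA[i + 1] - 1):
        j = JA[k] - 1
        A[i, j] = A_vals[k]
        A[j, i] = A_vals[k]      # mirror the stored upper triangle

x = np.linalg.solve(A, np.array(RHS))
print(x)                         # approx [2. 1. 1. 2.]
```

This reproduces the solution vector reported by JCG above.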
Textbook methods such as the Jacobi (J), Gauss-Seidel (GS), Successive
Overrelaxation (SOR--fixed relaxation factor omega), Symmetric
Successive Overrelaxation (SSOR--fixed relaxation factor omega), and
the RS method can be obtained from this package by resetting appropriate
parameters after the subroutine DFAULT() is called but before
ITPACK routines are called.
     Method               Use
     ------               ---
     J                    JSI()
     GS                   SOR()
     SOR--fixed omega     SOR()
     SSOR--fixed omega    SSORSI()
     RS                   RSSI()

(The specific parameter settings for each method appeared as images in the
original table and could not be recovered.)
These methods were not included as separate routines because they are
usually slower than the accelerated methods included in this package.
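For instance, the GS entry in the table reflects the standard identity that Gauss-Seidel is SOR with relaxation factor omega = 1. A sketch of that identity (Python with NumPy, applied to the 4x4 system of the sample program above; not ITPACK code):

```python
import numpy as np

# The 4x4 example system from the sample program above.
A = np.array([[ 4., -1., -1.,  0.],
              [-1.,  4.,  0., -1.],
              [-1.,  0.,  4., -1.],
              [ 0., -1., -1.,  4.]])
b = np.array([6., 0., 0., 6.])

def sor(A, b, omega, tol=1e-10, itmax=500):
    """Plain SOR sweeps; omega = 1.0 reduces to Gauss-Seidel."""
    n = len(b)
    x = np.zeros(n)
    for it in range(itmax):
        for i in range(n):
            # Sum over all off-diagonal terms with the latest iterates.
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] += omega * ((b[i] - sigma) / A[i, i] - x[i])
        if np.linalg.norm(b - A @ x) < tol:
            return x, it + 1
    return x, itmax

x_gs, its = sor(A, b, omega=1.0)   # omega = 1: Gauss-Seidel
print(x_gs)                        # approx [2. 1. 1. 2.]
```

Since the coefficient matrix is symmetric and diagonally dominant, the sweep converges to the same solution the accelerated routines compute.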
On the black unknowns, the Cyclic Chebyshev Semi-Iterative (CCSI)
method of Golub and Varga [2] gives the same result as the RSSI
method. The CCSI and RSSI methods converge at the same rate, and each
of them converges twice as fast as the JSI method. This is a theoretical
result [6] and does not count the time involved in establishing
the red-black indexing and the red-black partitioned system. Similarly,
the Cyclic Conjugate Gradient (CCG) method with respect to the black
unknowns, considered by Reid [16] (see also Hageman and Young
[6]), gives the same results as the RSCG method. Also, the CCG and
the RSCG methods converge at the same rate, and each of them converges,
theoretically, exactly twice as fast as the JCG method. Hence, the
accelerated RS methods are preferable to the accelerated J methods
when using a red-black indexing.
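The red-black reduction described above can be illustrated on the 4x4 example system: unknowns 1 and 4 couple only to unknowns 2 and 3, so {1, 4} may be colored red and {2, 3} black, and eliminating the red unknowns leaves a 2x2 reduced system on the black ones. A sketch (Python with NumPy; the coloring here is worked out by hand for this small example, not produced by ITPACK's indexing routines):

```python
import numpy as np

# 4x4 example system; red unknowns {1, 4}, black unknowns {2, 3}
# (a valid red-black coloring: reds couple only to blacks).
A = np.array([[ 4., -1., -1.,  0.],
              [-1.,  4.,  0., -1.],
              [-1.,  0.,  4., -1.],
              [ 0., -1., -1.,  4.]])
b = np.array([6., 0., 0., 6.])
red, black = [0, 3], [1, 2]

D_R = A[np.ix_(red, red)]        # diagonal block on the red unknowns
D_B = A[np.ix_(black, black)]    # diagonal block on the black unknowns
C = A[np.ix_(red, black)]        # red-to-black coupling block

# Reduced system on the black unknowns:
#   (D_B - C^T D_R^{-1} C) x_B = b_B - C^T D_R^{-1} b_R
S = D_B - C.T @ np.linalg.solve(D_R, C)
rhs_B = b[black] - C.T @ np.linalg.solve(D_R, b[red])
x_B = np.linalg.solve(S, rhs_B)

# Back-substitute for the red unknowns.
x_R = np.linalg.solve(D_R, b[red] - C @ x_B)

x = np.empty(4)
x[red], x[black] = x_R, x_B
print(x)                         # approx [2. 1. 1. 2.]
```

The 2x2 reduced system is what the RS-type routines iterate on; half of the unknowns are recovered afterwards by one direct back-substitution, which is the source of the factor-of-two savings cited above.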