Distortion-Free Data Embedding for Images
Miroslav Goljan (1), Jessica J. Fridrich (2), Rui Du (1)
(1) Dept. of Electrical Engineering, SUNY Binghamton, Binghamton, NY 13902
(2) Center for Intelligent Systems, SUNY Binghamton, Binghamton, NY 13902
{bg22976,fridrich,bh09006}@binghamton.edu
Abstract. One common drawback of virtually all current data embedding methods is the fact that the original image is inevitably distorted by some small amount of noise due to the data embedding itself. This distortion typically cannot be removed completely due to quantization, bit-replacement, or truncation at the grayscales 0 and 255. Although the distortion is often quite small, it may not be acceptable for medical imagery (for legal reasons) or for military images inspected under unusual viewing conditions (after filtering or extreme zoom). In this paper, we introduce a general approach to high-capacity data embedding that is distortion-free (or lossless) in the sense that after the embedded information is extracted from the stego-image, we can revert to the exact copy of the original image before the embedding occurred. The new method can be used as a powerful tool to achieve a variety of non-trivial tasks, including distortion-free robust watermarking, distortion-free authentication using fragile watermarks, and steganalysis. The proposed concepts are also extended to lossy image formats, such as JPEG.
1 Introduction
Data embedding applications can be divided into two groups depending on the relationship between the embedded message and the cover image. The first group is formed by steganographic applications in which the message has no relationship to the cover image; the only role the cover image plays is that of a decoy to mask the very presence of communication. The content of the cover image has no value to the sender or the decoder; its main purpose is to mask the secret embedded message. In this typical example of a steganographic application for covert communication, the receiver has no interest in the original cover image before the message was embedded. Thus, there is no need for distortion-free data embedding techniques in such applications.
The second group of applications is frequently addressed as digital watermarking. In a typical watermarking application, the message has a close relationship to the cover image. The message supplies additional information about the image, such as an image caption, ancillary data about the image origin, an author signature, an image authentication code, etc. While the message increases the practical value of the image, the act of embedding inevitably introduces some amount of distortion. It is highly desirable that this distortion be as small as possible while meeting other requirements, such as minimal robustness and sufficient payload. Models of the human visual system are frequently used to make sure that the distortion due to embedding is imperceptible to the human eye. There are, however, some applications for which any distortion introduced to the image is unacceptable. A good example is medical imagery, where even small modifications are not allowed for obvious legal reasons and because of the potential risk of a physician misinterpreting an image. As another example, we mention law enforcement and military image analysts who may inspect imagery under special viewing conditions where typical assumptions about distortion visibility do not apply. Those conditions include extreme zoom, iterative filtering, and enhancement.
Until recently, almost all data embedding techniques, especially high-capacity data embedding techniques, introduced some amount of distortion into the original image, and the distortion was permanent and not reversible. As an example, we can take simple Least Significant Bit (LSB) embedding, in which the LSB plane is irreversibly replaced with the message bits. In this paper, we present a solution to the problem of how to embed a large payload in digital images in a lossless (invertible) manner so that after the payload bits are extracted, the image can be restored to its original form before the embedding started. Even though the distortion is completely invertible, we pay close attention to minimizing the amount of distortion after embedding. We note that in this paper, the expressions "distortion-free", "invertible", and "lossless" are used as synonyms.
The ability to embed data in an image in a lossless manner, without having to expand the image or append the data, is quite useful. Data embedded in a header or a separate file can be easily lost during file format conversion or resaving. Additional information embedded directly in the image as extra lines or columns may cause visually disturbing artifacts and increases the image file size. In contrast, information that is embedded in the image is not modified by compatible format conversion or resaving, no bandwidth increase is necessary to communicate the additional information, and better security is obtained because the embedded information is inconspicuous and imperceptible. For increased security, a secret key can protect the embedding process.
In the next section, we briefly describe prior relevant techniques and discuss their limitations. Section 3 contains a detailed exposition of the algorithms for the new lossless data embedding method. Capacity estimates and some sample results are also provided in the same section. Further analysis and improvements are presented in Section 4, where we also give an alternative interpretation of the new method that enables a formal analysis of the proposed algorithms. In Section 5, we experimentally investigate the relationship between the capacity and distortion, and the influence of image noise on the capacity. Section 6 discusses several important applications of distortion-free data embedding, including invertible fragile image authentication, distortion-free robust watermarking, the extension of distortion-free embedding techniques to lossy formats, and steganalysis. The paper is concluded in Section 7.
2 Prior Art
As far as the authors are aware, the concept of distortion-free data embedding appeared for the first time in an authentication method in a patent owned by Eastman Kodak [5]. The authors describe a fragile invertible authentication method that utilizes a robust watermark in the spatial domain [6]. Their watermarking technique is a spatial additive non-adaptive scheme in which the addition has been replaced with addition modulo 256. The payload of the watermark is the hash H(I) of the original image I. It is important that the watermark pattern W be only a function of the hash and a secret key K, W = W(H(I),K). The watermark pattern W is added to the original image I using modulo addition
Iw = (I + W) mod 256 ,
where Iw is the watermarked image. The verification process starts with extracting the watermark payload H' (a candidate for the hash), computing the watermark pattern W' from the extracted payload and the secret key, W' = W(H', K), and subtracting the watermark W' from the watermarked image Iw modulo 255. Finally, the hash of the result is calculated. Only when the calculated hash matches the extracted hash (payload), the image is deemed authentic:
H[(Iw − W(H',K)) mod 256] = H'  ⇒  Iw is authentic,
H[(Iw − W(H',K)) mod 256] ≠ H'  ⇒  Iw is not authentic.
We point out that if the image is authentic, the original image data I is obtained. The addition modulo 256 may introduce some disturbing artifacts resembling a correlated salt-and-pepper noise when pixels with grayscales close to zero are flipped to values close to 255 and vice versa. For many typical images, however, the number of flipped pixels is small and the (invertible) artifacts are not that disturbing. The salt-and-pepper noise is the only distortion with respect to which the watermarking technique needs to be robust. If the number of flipped pixels is too large, as for astronomical images, the authenticated image may not be correctly verified as authentic; such images cannot be authenticated with this technique. The problem can be significantly alleviated by attempting to identify candidates for flipped pixels and replacing them with a more likely value before extracting the payload H'. For more detailed analysis and further generalization of this technique, the reader is referred to our previous paper on invertible authentication [1].
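A minimal sketch of this invertible authentication scheme follows. Here SHA-256 stands in for the unspecified hash, the keyed generator for the watermark pattern W(H, K) is our own toy construction, and the robust extraction of the payload H' is abstracted away (the verifier receives it directly); none of these choices are prescribed by the scheme itself.

```python
import hashlib

def watermark_pattern(payload_hash, key, length):
    # Toy stand-in for W(H, K): a key-dependent pseudo-random pattern of
    # small signed values, so mod-256 wraparound (the salt-and-pepper
    # artifact discussed above) stays rare.
    seed = hashlib.sha256(key + payload_hash).digest()
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return [(b % 5) - 2 for b in stream[:length]]

def embed_auth(pixels, key):
    # The payload is the hash of the original image; embedding is
    # addition modulo 256, which is invertible on {0, ..., 255}.
    h = hashlib.sha256(bytes(pixels)).digest()
    w = watermark_pattern(h, key, len(pixels))
    return [(x + wi) % 256 for x, wi in zip(pixels, w)], h

def verify_auth(watermarked, extracted_hash, key):
    # Subtract W(H', K) mod 256 and compare the hash of the result
    # with H'; on a match, 'restored' is the exact original image.
    w = watermark_pattern(extracted_hash, key, len(watermarked))
    restored = [(x - wi) % 256 for x, wi in zip(watermarked, w)]
    return hashlib.sha256(bytes(restored)).digest() == extracted_hash, restored
```

Because modulo-256 addition is a bijection on the gray levels, verification of an untampered image returns the original pixels exactly; any single-pixel change breaks the hash match.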
In the same paper, we introduced a different method for invertible authentication and distortion-free data embedding based on lossless compression of bit-planes. In this method, we start with the lowest bit-plane and calculate its redundancy, defined as the difference between the number of pixels and the length of the same bit-plane compressed with the JBIG lossless compression method. We proceed to higher bit-planes until the redundancy becomes greater than or equal to the payload that needs to be embedded. If this technique is used for authentication, only 128 bits (for the MD5 hash) need to be embedded. Most high-quality images can be authenticated in the lowest three bit-planes; noisy images may require using the 4th or the 5th bit-plane. Once the bit-plane is found, the compressed bit-plane and the payload are encrypted and inserted in the same bit-plane. Extraction (or verification) proceeds in the reverse order. The bit-plane is first decrypted, the hash is extracted, and the compressed bit-plane is decompressed. The encrypted bit-plane is replaced with the decompressed original bit-plane, and the hash of the image is compared to the extracted hash. Again, if the two hashes match, the image is deemed authentic; otherwise it is not. Only images that have no losslessly compressible structure in any bit-plane cannot be authenticated. The capacity of this technique can easily be traded for distortion by choosing different bit-planes, but the artifacts can quickly become visible depending on the message length and the noisiness of the original image.
Macq [7] described a modification to the patchwork algorithm to achieve lossless watermark embedding. He also uses modulo-256 addition and essentially embeds a one-bit watermark. It is unclear whether this technique could be used for authentication or general data embedding with practical payloads.
In the next section, we present a new, simple, and elegant lossless data embedding method that allows relatively large payloads while making very small modifications to the image.
3 Distortion-Free High-Capacity Data Embedding Method
The reason why most data embedding techniques cannot be completely reversed is the loss of information due to discarded (replaced) information, quantization, and integer rounding at the boundaries of the grayscale range (at the 0 and 255 gray levels). Most high-capacity data embedding techniques are based on either bit-replacement or quantization, and there is little hope that a distortion-free data embedding scheme could be constructed from such schemes. Additive non-adaptive schemes are almost lossless except for the pixels with grayscales close to 0 or 255 where truncation has occurred. Modulo addition as proposed in [1,5,7] can solve the problem at the expense of introducing very visible artifacts. Another drawback of lossless data embedding based on additive robust watermarks is their very limited capacity.
In our previous work [1], we proposed an invertible fragile watermark for image authentication based on lossless compression of bit-planes. The idea behind this method is to "make some space" in the image by losslessly compressing a bit-plane with some minimal compressible structure. The newly created space can be used for embedding an additional message. However, higher payloads force us to use higher bit-planes, quickly increasing the distortion in the image beyond an acceptable level. In this paper, instead of using bit-planes we generate losslessly compressible bit-streams using the concepts of invertible noise adding (flipping) and special discrimination (prediction) functions on small groups of pixels. The new approach is much more efficient, allowing large payloads with minimal (invertible) distortion.
Let us assume that the original image is a grayscale image with M×N pixels and with pixel values from the set P. For example, for an 8-bit grayscale image, P = {0, …, 255}. We start by dividing the image into disjoint groups of n adjacent pixels (x1, …, xn). For example, we can choose groups of n = 4 consecutive pixels in a row. We also define a so-called discrimination function f that assigns a real number f(x1, …, xn) ∈ R to each pixel group G = (x1, …, xn). The purpose of the discrimination function is to capture the smoothness or "regularity" of the group of pixels G. As pointed out at the end of Section 4, image models or statistical assumptions about the original image can be used for the design of other discrimination functions. For example, we can choose the 'variation' of the group of pixels (x1, …, xn) as the discrimination function f:
f(x1, x2, …, xn) = Σi=1..n−1 |xi+1 − xi| .   (1)
Finally, we define an invertible operation F on P called "flipping". Flipping is a permutation of gray levels that entirely consists of two-cycles. Thus, F has the property that F² = Identity, or F(F(x)) = x for all x ∈ P.
We use the discrimination function f and the flipping operation F to define three types of pixel groups, R, S, and U:
Regular groups:  G ∈ R ⇔ f(F(G)) > f(G)
Singular groups: G ∈ S ⇔ f(F(G)) < f(G)
Unusable groups: G ∈ U ⇔ f(F(G)) = f(G).
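For illustration, this classification step can be sketched directly in code; a minimal version using the variation function (1) and the F_LSB flipping permutation introduced below (the sample pixel values are our own):

```python
def f_variation(group):
    # Discrimination function (1): sum of absolute differences between
    # neighboring pixels -- small for smooth groups, large for noisy ones.
    return sum(abs(group[i + 1] - group[i]) for i in range(len(group) - 1))

def flip_lsb(x):
    # F_LSB: the two-cycle permutation 0<->1, 2<->3, ..., 254<->255.
    return x ^ 1

def classify(group):
    # Return 'R' (regular), 'S' (singular), or 'U' (unusable).
    delta = f_variation([flip_lsb(x) for x in group]) - f_variation(group)
    return 'R' if delta > 0 else 'S' if delta < 0 else 'U'
```

For example, the smooth group (20, 21, 22, 23) classifies as R: flipping the LSBs gives (21, 20, 23, 22), whose variation 5 exceeds the original variation 3.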
In the expression F(G), the flipping function F is applied to all (or selected) components of the vector G = (x1, …, xn). The noisier the group of pixels G is, the larger the value of the discrimination function becomes. The purpose of the flipping F is to perturb the pixel values in an invertible way by some small amount, thus simulating "invertible noise adding". In typical pictures, adding a small amount of noise (i.e., flipping by a small amount) will lead to an increase in the discrimination function rather than a decrease. Although this bias may be quite small, it will enable us to embed a large amount of information in an invertible manner.
As explained above, F is a permutation that consists entirely of two-cycles. For example, the permutation FLSB defined as 0 ↔ 1, 2 ↔ 3, …, 254 ↔ 255 corresponds to flipping (negating) the LSB of each gray level. The permutation 0 ↔ 2, 1 ↔ 3, 4 ↔ 6, 5 ↔ 7, … corresponds to an invertible noise with a larger amplitude of two. Many other flipping permutations are possible, including those in which the flipping is irregular, with several different changes in gray scales rather than just one.
A useful numerical characteristic for the permutation F is its "amplitude". The amplitude A of the flipping permutation F is defined as the average change of x under the application of F:
A = (1/|P|) Σx∈P |F(x) − x| .   (2)
For FLSB the amplitude is 1. The other permutation from the previous paragraph has A = 2. Larger values of the amplitude A correspond to adding more noise after applying F.
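Definition (2) can be evaluated by direct enumeration over the gray levels; a small sketch (the XOR-based permutations below are our own compact encodings of the two examples in the text):

```python
def amplitude(flip, levels=range(256)):
    # Eq. (2): average change of a gray level under the flipping
    # permutation F, computed by enumerating the whole gray scale.
    levels = list(levels)
    return sum(abs(flip(x) - x) for x in levels) / len(levels)

# F_LSB (0<->1, 2<->3, ...) is x ^ 1; the amplitude-two permutation
# (0<->2, 1<->3, 4<->6, 5<->7, ...) is x ^ 2.
```

For instance, amplitude(lambda x: x ^ 1) evaluates to 1 and amplitude(lambda x: x ^ 2) to 2, matching the values quoted above.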
Having explained the logic behind the definitions, we now outline the principle of the new lossless highcapacity data embedding method.
Let us denote the number of regular, singular, and unusable groups in the image as NR, NS, and NU, respectively. We have NR+NS+NU = MN/n. Because real images have spatial structures, we expect a bias between the number of regular groups and singular groups: NR > NS. As will be seen below, this bias will enable us to losslessly embed data. We further note that
if G is regular, F(G) is singular,
if G is singular, F(G) is regular, and
if G is unusable, F(G) is unusable.
Thus, the R and S groups are flipped into each other under the flipping operation F, while the unusable groups U do not change their status. In a symbolic form, F(R)=S, F(S)=R, and F(U)=U.
We can now formulate the data embedding method. By assigning a 1 to R and a 0 to S, we embed one message bit in each R or S group. If the message bit and the group type do not match, we apply the flipping operation F to the group to obtain a match. We cannot use all R and S groups for the payload, because we need to be able to revert to the exact original image after we extract the data at the receiving end. To solve this problem, we use an idea similar to the one proposed in our previous paper [1]. Before the embedding starts, we scan the image by groups and losslessly compress the status of the image, i.e., the bit-stream of R and S groups (the RS-vector), with the U groups simply skipped. We do not need to include the U groups because they do not change in the process of message embedding and can all be unambiguously identified and skipped during embedding and extraction. We take the compressed RS-vector C, append the message bits to it, and embed the resulting bit-stream in the image using the process described above.
At the receiving end, the user simply extracts the bitstream from all R and S groups (R=1, S=0) by scanning the image in the same order as during the embedding. The extracted bitstream is separated into the message and the compressed RSvector C. The bitstream C is decompressed to reveal the original status of all R and S groups. The image is then processed once more and the status of all groups is adjusted as necessary by flipping the groups back to their original state. Thus, the exact copy of the original image is obtained. The block diagram of the embedding and extracting procedure is given in Fig. 1 below.
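The two passes can be sketched as follows. This is a deliberately simplified, hypothetical version: the RS-vector is handed over as explicit side information instead of being compressed and embedded alongside the message, so it only illustrates the flip-to-encode / flip-back-to-restore mechanics, not the positive net capacity of the full method.

```python
def f_variation(group):
    # Discrimination function (1): sum of absolute neighbor differences.
    return sum(abs(group[i + 1] - group[i]) for i in range(len(group) - 1))

def flip_lsb(x):
    # F_LSB: 0<->1, 2<->3, ..., 254<->255.
    return x ^ 1

def classify(group):
    d = f_variation([flip_lsb(x) for x in group]) - f_variation(group)
    return 'R' if d > 0 else 'S' if d < 0 else 'U'

def embed(pixels, message_bits, n=4):
    # Encode one bit per R/S group (R=1, S=0), flipping groups that do
    # not match. The original statuses of the used groups are returned
    # as side information; in the full method this RS-vector would be
    # losslessly compressed and embedded together with the message.
    out, side_info, k = list(pixels), [], 0
    for i in range(0, len(out) - n + 1, n):
        if k == len(message_bits):
            break
        status = classify(out[i:i + n])
        if status == 'U':
            continue  # U groups are skipped and never changed
        side_info.append(1 if status == 'R' else 0)
        wanted = 'R' if message_bits[k] == 1 else 'S'
        if status != wanted:
            out[i:i + n] = [flip_lsb(x) for x in out[i:i + n]]
        k += 1
    assert k == len(message_bits), "not enough R/S groups for the payload"
    return out, side_info

def extract_and_restore(stego, num_bits, side_info, n=4):
    # Read the message back (R=1, S=0) and flip mismatching groups to
    # their recorded original status, recovering the exact cover image.
    out, message, k = list(stego), [], 0
    for i in range(0, len(out) - n + 1, n):
        if k == num_bits:
            break
        status = classify(out[i:i + n])
        if status == 'U':
            continue
        message.append(1 if status == 'R' else 0)
        original = 'R' if side_info[k] == 1 else 'S'
        if status != original:
            out[i:i + n] = [flip_lsb(x) for x in out[i:i + n]]
        k += 1
    return message, out
```

The key invariant is that F maps R groups to S groups and back, so every flipped group remains readable at the receiver and can be flipped back to its recorded original status.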
Fig. 1. Diagram for the distortion-free data embedding and extraction algorithm
The raw information capacity for this data embedding method is NR + NS = MN/n − NU bits. However, because we need to store the message and the compressed bit-stream C, the real capacity Cap that can be used for the message is
Cap = NR + NS − |C| ,
where |C| is the length of the bit-stream C. As the bias between R and S groups increases, the compressed bit-stream C becomes shorter and the capacity higher. An ideal lossless context-free compression scheme (an entropy coder) [8] would compress the RS-vector consisting of NR + NS bits into
NR log2((NR+NS)/NR) + NS log2((NR+NS)/NS) bits.
As a result, we obtain a theoretical estimate (an upper bound) Cap' for the real capacity
Cap' = NR + NS − NR log2((NR+NS)/NR) − NS log2((NR+NS)/NS) .
This estimate will be positive whenever there is a bias between the number of R and S groups, i.e., when NR ≠ NS. This bias is influenced by the size and shape of the group G, the discrimination function f, the amplitude of the flipping permutation F, and the content of the original image. The bias increases with the group size n and the amplitude of the permutation F. Smoother and less noisy images lead to a larger bias than images that are highly textured or noisy.
The bias is not, however, the parameter that should be optimized for this scheme; the real capacity Cap is the characteristic that should be maximized to obtain the best performance. Our goal is to choose the combination of group size and shape, permutation F, and discrimination function f that maximizes the capacity while keeping the distortion to the image as small as possible. The theoretical estimate Cap' for the capacity was experimentally verified on test images using the adaptive arithmetic coder [8] applied to image rows as the lossless compression. It was found that the estimated capacity Cap' matched the real capacity Cap within 15–30 bits depending on the image size.
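Given the counts NR and NS, the estimate Cap' is a one-line computation; a small sketch:

```python
import math

def estimated_capacity(nr, ns):
    # Cap' = NR + NS minus the bits an ideal entropy coder spends on
    # the RS-vector: NR*log2((NR+NS)/NR) + NS*log2((NR+NS)/NS).
    total = nr + ns
    if nr == 0 or ns == 0:
        return float(total)  # a constant RS-vector compresses to ~0 bits
    return total - nr * math.log2(total / nr) - ns * math.log2(total / ns)
```

With no bias (NR = NS) the estimate is zero, as expected: every bit of raw capacity is consumed by the compressed RS-vector.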
We performed a number of experiments to see how the capacity and distortion change with different group sizes and shapes, discrimination functions f, and flipping operations F. It was a rather unexpected result that the highest capacity was obtained for relatively small groups (n ≈ 4). Another surprising fact was that a quite reasonable capacity could be obtained from the flipping permutation FLSB that influences only the LSBs. This was true for all images, including those that did not show any structure in their LSB plane.
Table 1. Estimated capacity Cap' for ten grayscale test images as a function of the amplitude a

Test image (M×N)        a=1    a=2    a=3     a=4     a=5     a=6     a=7
LennaFace (128×128)     170    521    1045    1390    1865    1996    2342
Lenna (256×256)         1038   2916   5095    6027    7663    7783    8988
PalmTrees (400×268)     916    2274   4020    4621    5778    6643    7971
GoldenGate (400×268)    4325   8930   14001   14351   16865   16460   18341
Mountains (400×268)     1656   3790   6426    7575    9602    10432   12149
Desert (400×268)        7133   10935  17170   16959   19134   18568   20095
Mandrill (512×512)      186    702    1810    2905    4398    5664    7643
ElCapitan (592×800)     2500   12219  18898   26627   36774   42133   51430
NYC (1024×768)          6773   17766  30883   37516   48434   52553   61614
Girl (1024×1536)        25506  65577  109865  131994  166806  176587  204761
Average Cap'/(M×N)      1.88%  4.11%  6.86%   7.82%   9.72%   10.16%  11.73%
Average PSNR (dB)       53.12  46.67  42.84   39.27   38.26   36.06   35.32
In Table 1, we give an example of how the amplitude of the flipping function influences the capacity Cap' and the distortion for ten grayscale images shown in Fig. 2 below. We used groups of n=4 consecutive pixels and seven flipping operations with amplitudes ranging from 1 to 7. We can see a very high variability in capacity between images. Images with abundant highly textured areas and noisy images have generally smaller capacity. It is also very apparent that the capacity increases very fast with amplitude. Further analysis and improvements of the new method are given in the next section.
Fig. 2. Test images used in Table 1
4 Analysis and Further Generalization
One of the goals set in this paper is to maximize the capacity while keeping the invertible distortion as small as possible. Several factors influence the capacity-distortion trade-off: the discrimination function, the flipping operation, and the size and shape of the groups. The influence of the amplitude of the flipping operation is clear: the capacity rapidly increases with the amplitude, as can be seen in Table 1 and in Fig. 3 in Section 5. The role of the size and shape of the groups, as well as the choice of the discrimination function, is a more complicated issue that will be discussed in this section. We point out a close relationship between image models, or a priori assumptions about the image, and the discrimination function. Finally, we outline a methodology for building a theoretical model of the new data embedding method and further optimizing its performance.
First, we looked at the influence of the size of the groups. We found from our experiments that groups of approximately four pixels gave us the best overall capacity for all amplitudes. Groups that are too small will generate too small a bias between the R and S groups and therefore decrease the capacity, in spite of the fact that the number of groups increases. Although large groups achieve a larger bias between R and S groups and have fewer U groups, the capacity will decrease due to the small number of groups. We also observed that for smaller amplitudes, the highest capacity was sometimes obtained for a group size of five, while for larger amplitudes (e.g., larger than 6), smaller groups of only three pixels gave us slightly better results.
A scheme that uses groups of n pixels can never achieve a capacity higher than 1/n bits per pixel (bpp). Thus, a natural way to increase the capacity would be to use overlapping groups of pixels rather than disjoint groups. However, overlapping groups lead to the problem that pixels that were already modified influence the status of groups that have not yet been visited. This will not only decrease the bias and complicate the data extraction process but may prevent us from recovering the embedded data altogether. The problem can be avoided by using groups that overlap only in pixels that are not flipped during embedding. For example, we could use groups of four pixels in a row and flip only the middle two pixels (but calculate the discrimination function from all four pixels as before). This enables us to use the following overlapping groups of pixels: (x1, x2, x3, x4), (x4, x5, x6, x7), (x7, x8, x9, x10), …. The maximal possible capacity of this technique is 1/3 bpp, as opposed to 1/4 bpp for disjoint groups of four pixels.
This observation led us toward designs in which the embedding is done in multiple passes and the groups are intertwined as much as possible, overlapping in possibly many pixels, with only one pixel being flipped. We tested several interesting designs that gave us significantly higher capacity than the original disjoint groups of four. One of the best and simplest designs was the Checkerboard scheme. In this scheme, the image is divided into 'Black' and 'White' pixels in the same way as a chess board (pixel xij is Black if i+j is odd, otherwise it is White). The data embedding method uses two passes. In the first pass, we go through all Black pixels xij, i+j mod 2 = 1, skipping the White ones. We flip only the Black pixel but evaluate the discrimination function from its four closest White neighbors
f(xij; xi−1,j, xi+1,j, xi,j−1, xi,j+1) = |xij − xi−1,j| + |xij − xi+1,j| + |xij − xi,j−1| + |xij − xi,j+1| .
In the second pass, we move through the White pixels only and evaluate the discrimination function from their four Black neighbors. Since the Black neighbors have already been modified in the first pass, the capacity for the second pass will be smaller than for the first pass. Nevertheless, the overall capacity of this Checkerboard scheme with FLSB is about 100% higher than the capacity of the scheme with disjoint groups of four from Table 1. The capacity increased from 916 bits to 2128 bits for the image 'PalmTrees', from 1656 bits to 3563 bits for 'Mountains', and from 7133 bits to 13208 bits for 'Desert'. Finally, we mention that the PSNR for both techniques is approximately the same.
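The first-pass classification of a single Black pixel can be sketched as follows; the helper names and the 3×3 sample arrays are our own, and boundary handling is omitted for brevity:

```python
def f_cross(img, i, j):
    # Checkerboard discrimination function: absolute differences between
    # pixel (i, j) and its four horizontal/vertical neighbors.
    x = img[i][j]
    return (abs(x - img[i - 1][j]) + abs(x - img[i + 1][j]) +
            abs(x - img[i][j - 1]) + abs(x - img[i][j + 1]))

def classify_center(img, i, j):
    # Classify the cross-shaped group by flipping only the center
    # pixel's LSB and comparing the discrimination function values.
    base = f_cross(img, i, j)
    img[i][j] ^= 1
    flipped = f_cross(img, i, j)
    img[i][j] ^= 1  # undo the trial flip
    return 'R' if flipped > base else 'S' if flipped < base else 'U'
```

Since only the center pixel is ever flipped, the four neighbors used by f_cross are untouched within a pass, which is exactly what makes the heavy overlap between groups admissible.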
The previous paragraph indicates that the choice of the group size and the selection of the pixels to be flipped can influence the performance of the embedding scheme in a profound manner. If we assign amplitude A = 0 to the identity permutation, the group shape and its amplitudes can be conveniently expressed using a mask M = [A1 A2 A3 A4 …], meaning that a predefined permutation with amplitude Ai is applied to the pixel xi, etc. For groups that form a two-dimensional pattern, the amplitudes in the mask are listed in a row-by-row manner. Using this convention, in the paragraphs below we present further important observations.
If the same flipping permutation is applied to all pixels in the group (for example, for the mask [1 1 1 1]), the discrimination function (1) would not change in flat areas for which x1 = x2 = x3 = x4. Thus, for images that have large areas of constant color, such as astronomical images or computer-generated images, the capacity would be undesirably decreased because of too many U groups. While it may be desirable to intentionally avoid areas with no activity, the overall capacity will be decreased. Using different amplitudes for the pixels in one group will turn those U groups from flat areas into R groups, and the capacity will be increased.
It is possible to use masks that do not produce any U groups. For example, let us take the mask [1 0 0 0] for a group of 2×2 pixels, where x1 is the only flipped pixel in the group. The function f = |x1 − x2| + |x1 − x3| + |x1 − x4| will generate only R or S groups but no U groups, because the change in each term is either +1 or −1 and there are three terms, so f always changes by an odd amount. It may appear that having no U groups must always lead to an increase in capacity, but this is almost never the case, because the bias between R and S groups may worsen, leading to a smaller overall capacity. From our experience, the presence of U groups is actually beneficial if we want to maximize the capacity.
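The parity argument for the mask [1 0 0 0] can also be checked exhaustively on a reduced gray-scale range; a quick sketch (the reduced range {0, …, 7} is our own choice to keep the enumeration small):

```python
from itertools import product

def f_mask1000(g):
    # f for the mask [1 0 0 0]: three terms, only x1 is ever flipped.
    return abs(g[0] - g[1]) + abs(g[0] - g[2]) + abs(g[0] - g[3])

def any_unusable(levels=range(8)):
    # Flipping x1's LSB changes each of the three terms by +1 or -1, so
    # f changes by an odd amount; exhaustively confirm no group is U.
    return any(f_mask1000((g[0] ^ 1,) + g[1:]) == f_mask1000(g)
               for g in product(levels, repeat=4))
```

The exhaustive search over all 8^4 groups finds no group whose f value is unchanged by the flip, confirming that this mask produces no U groups.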
In the last part of this section, we give another interpretation of the proposed method that will enable us to formulate a formal theoretical apparatus for the analysis of the proposed distortion-free data embedding scheme.
Let us assume that we have a grayscale image, disjoint groups of n pixels, and a flipping operation F applied to selected pixels in each group. Let S be the set of all possible states of each group, consisting of the 256^n n-tuples of integers from the set of grayscales P. The flipping operation F separates S into pairs of states x, y ∈ S that are flipped into each other: F(x) = y, F(y) = x. Let us further assume that we have an image model that enables us to say whether x or y is more likely to occur in natural images. We call the group G regular if its state x is the one that is more likely to occur, singular if it is the one that is less likely to occur, and unusable if the image model cannot decide whether x or y is more likely to occur. The rest of the embedding and extraction stays the same as described in Section 3. In view of this interpretation, the discrimination function (1) is a special case of an image model derived from the assumption that groups with smaller variation are more likely to occur than groups with higher variation.
This alternative interpretation of our method allows for a more efficient embedding procedure and a formal mathematical analysis based on image models. For example, natural images are likely to be piecewise linear on small groups. Thus, one can define a discrimination function as the sum of deviations from the best least-squares local fit to the pixel values in the group. This model, however, exhibited worse results in practical tests. In the future, we intend to investigate discrimination functions derived from Markov image models. Due to lack of space, this discussion is postponed to our forthcoming paper [4].
5 Experimental Results
To obtain a better understanding of how different components and parameters affect the performance of the proposed lossless data embedding method, we present some results in graphical form. All experiments were performed with five small grayscale test images ('Lenna' with 256×256 pixels; 'PalmTrees', 'GoldenGate', and 'Mountains' with 400×268 pixels; and 'NYC' at the resolution 1024×768).
Capacity-amplitude-distortion relationship: To explain how the capacity and distortion change with the amplitude of the permutation F, we plotted the capacity (as a percentage of the total number of pixels) and the PSNR as functions of the amplitude of F. The results shown in Fig. 3 were obtained with groups of 2×2 pixels, the mask [1 1 1 1], and the discrimination function (1). If the message to be embedded is a random bit-stream (for example, if the message is encrypted), the PSNR of the embedded images can be calculated using a simple formula (assuming a non-overlapping embedding mask M = [A1, …, An]) that closely matches our experiments:
MSE ≈ (1/(2n)) Σi=1..n Ai² ,   PSNR = 10 log10(255²/MSE) dB .
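This prediction can be cross-checked against an empirically measured PSNR; a small sketch in which the analytic MSE = (1/2n) Σ Ai² term is our reconstruction (each pixel flipped with probability 1/2 by its amplitude), so treat it as an assumption rather than the paper's exact formula:

```python
import math

def psnr(orig, modified, peak=255):
    # Empirical PSNR between two equally sized pixel sequences.
    mse = sum((a - b) ** 2 for a, b in zip(orig, modified)) / len(orig)
    return 10 * math.log10(peak ** 2 / mse)

def predicted_psnr(mask, peak=255):
    # PSNR predicted from the embedding mask, assuming a random message
    # flips each pixel with probability 1/2 by its amplitude A_i:
    # MSE = (1/(2n)) * sum(A_i^2).
    mse = sum(a * a for a in mask) / (2 * len(mask))
    return 10 * math.log10(peak ** 2 / mse)
```

For the mask [1 1 1 1], flipping half the pixels by 1 gives MSE = 0.5 and a predicted PSNR of about 51 dB, which an image with every second pixel changed by 1 reproduces exactly.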
Fig. 3. Capacity-amplitude-distortion relationship        Fig. 4. Capacity vs. noise amplitude σ
Capacity vs. noise: Fig. 4 shows how the capacity depends on the amount of noise added to the image. The x axis is the standard deviation σ of white i.i.d. Gaussian noise added to the image, and the y axis is the ratio Cap'(σ)/Cap'(0) between the capacity after adding noise with standard deviation σ and the capacity of the original image without any added noise. The results correspond to the mask [4 4 4 4] with the discrimination function (1). The PSNR after message embedding was always in the range 39–40 dB. We note that the presence of noise decreases the capacity in a gradual rather than an abrupt way. Also, the capacity remains in the hundreds of bits even for images that contain very visible noise.
6 Other Applications
Lossless authentication: As argued in [1], distortion-free authentication is not possible if we insist that all possible images, including "random" images, be authenticable. However, in the same paper it is argued that distortion-free techniques can be developed for typical images with spatial correlations.
Our distortion-free data embedding method can be used to build a distortion-free fragile authentication watermark in the following manner. We calculate the hash of the whole image and embed it in the image using our lossless embedding method. Because the hash is a short bit-string, this can be achieved using the FLSB flipping permutation for most images. The distortion introduced by this method is very low, with PSNR often exceeding 60 dB. A secret key is used to select a random walk over the pixel groups and also to encrypt the hash. The integrity verification method starts with extracting the hash and the compressed bit-stream. The compressed bit-stream is used to obtain the original image, whose hash is then compared with the extracted hash. In case of a match, the image is deemed authentic; otherwise it is not.
Distortion-free embedding for JPEG images: JPEG images provide much less space for lossless embedding than raw images. The embedding must be performed in the transform domain rather than the (decompressed) spatial domain. Because of quantization and the typical characteristics of images, the distribution of middle- and high-frequency coefficients is already biased. This bias can be used to compress the original status of the coefficients efficiently and make room for a short message, such as a hash of the whole image. We have successfully tested distortion-free authentication of JPEG files using this idea. Additional details, further analysis, and an extension to MPEG-2 files are given in [3].
Distortion-free robust watermarking: A distortion-free robust watermark is a robust watermark that can be completely removed from the watermarked image if the image has not been distorted. For such a watermark, there is no need to store both the original image and its watermarked version, because the original can be obtained from the watermarked image. Storing only the watermarked version also gives an attacker less material for mounting an attack should he gain access to the computer system.
Spatial additive non-adaptive watermarking schemes, in which the watermarked image Xw is obtained by adding the watermark pattern W(Key, Payload) to the original image X, are almost invertible except for the loss of information due to truncation at the boundary of the dynamic range (i.e., at 0 and 255 for grayscale images). We propose not to modify those pixels that would overflow or underflow after the watermark is added. Assuming Y(i,j) = X(i,j) + W(i,j) for every pixel (i,j):
X'w(i,j) = Y(i,j) if Y(i,j) ∈ [0, 255],
X'w(i,j) = X(i,j) if Y(i,j) < 0,
X'w(i,j) = X(i,j) if Y(i,j) > 255.
In typical images, the set of such pixels is relatively small and can be compressed efficiently in a lossless manner. This information, along with the watermark strength and other parameters used for watermark construction, is then embedded in the watermarked image X'w using our distortion-free data embedding method.
If the resulting image is not modified, one can revert to the exact original image: after reading the watermark payload, we can generate the watermark pattern W and subtract it from all pixels except those whose indices were recovered from the losslessly embedded data. Preliminary experiments are encouraging and indicate that this approach is indeed plausible. Further analysis of this idea will be the subject of our future research.
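The truncation-safe embedding rule above and its inverse can be sketched as follows (hypothetical helper names; the lossless embedding of the skipped-pixel set itself is omitted):

```python
def embed_watermark(x, w):
    """Add watermark w to image x, leaving untouched any pixel that
    would fall outside [0, 255]; returns the watermarked pixels and
    the indices of the skipped pixels (to be stored losslessly)."""
    out, skipped = [], []
    for i, (p, d) in enumerate(zip(x, w)):
        y = p + d
        if 0 <= y <= 255:
            out.append(y)        # X'w = X + W
        else:
            out.append(p)        # X'w = X (would over/underflow)
            skipped.append(i)
    return out, skipped

def remove_watermark(xw, w, skipped):
    """Invert embed_watermark when the watermarked image is intact."""
    s = set(skipped)
    return [p if i in s else p - d for i, (p, d) in enumerate(zip(xw, w))]
```

Subtracting W everywhere except at the recorded indices restores X exactly, which is what makes the scheme invertible.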
Steganalysis of LSB embedding: The estimated capacity Cap' can be used as a sensitive measure for detecting image modifications, such as those due to data hiding or steganography. In this paragraph, we describe an idea for detecting LSB embedding in grayscale images. LSB embedding in grayscale images is relatively hard to detect for a number of reasons. The method based on Pairs of Values and χ²-statistics introduced by Westfeld and Pfitzmann [9] becomes reliable only when the data is embedded in consecutive pixels, or when the message length is comparable to the number of pixels (in the case of embedding along a random walk). Their method does not give reliable results even for secret messages occupying 50% of the pixels. The method of [2] was designed for color images and relies on pairs of close colors; it is completely ineffective for grayscale images.
We note that even very noisy images, whose LSB planes show no structure or regularity, have a nonzero capacity Cap' in the LSB plane. However, images with randomized LSB planes have capacity practically equal to zero, while their capacity in the second LSB plane decreases only slightly. This suggests that we can design a sensitive criterion for detecting steganography in the LSBs of grayscale images by calculating the ratio R = Cap'1/Cap'2 between the capacity Cap'1 for LSB flipping and the capacity Cap'2 for second-LSB flipping. Our preliminary experiments indicate that messages randomly scattered in the LSBs can be reliably detected even when the size of the secret message is only 30% of the number of pixels. A more detailed study will be the subject of our future research.
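A sketch of this detector, assuming groups of four consecutive pixels, the discrimination function (1), and the entropy-based capacity estimate Cap' = NR + NS + NR log2(NR/(NR+NS)) + NS log2(NS/(NR+NS)); the helper names are ours:

```python
import math

def smoothness(g):
    # discrimination function (1): sum of absolute neighbor differences
    return sum(abs(g[i + 1] - g[i]) for i in range(len(g) - 1))

def capacity_estimate(pixels, flip_mask, group=4):
    """Entropy-based capacity estimate Cap' for flipping by XOR with
    flip_mask: flip_mask=1 flips the LSB, flip_mask=2 the second LSB."""
    nr = ns = 0
    for i in range(0, len(pixels) - group + 1, group):
        g = pixels[i:i + group]
        before = smoothness(g)
        after = smoothness([p ^ flip_mask for p in g])
        if after > before:
            nr += 1              # regular group
        elif after < before:
            ns += 1              # singular group
    n = nr + ns
    if nr == 0 or ns == 0:
        return 0.0
    # Cap' = NR + NS + NR*log2(NR/n) + NS*log2(NS/n)
    return nr + ns + nr * math.log2(nr / n) + ns * math.log2(ns / n)
```

The detection statistic is then R = capacity_estimate(pixels, 1) / capacity_estimate(pixels, 2); values of R near zero suggest randomized LSBs.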
7 Conclusions and Future Directions
One common drawback of virtually all image data embedding methods is that the original image is inevitably distorted by some small amount of noise due to the data embedding itself. This distortion typically cannot be removed completely due to quantization, bit-replacement, or truncation at the grayscales 0 and 255. Although the distortion is often quite small, it may not be acceptable for medical imagery (for legal reasons) or for military images inspected under unusual viewing conditions (after filtering or extreme zoom). In this paper, we introduced a general approach to high-capacity data embedding that we call distortion-free (or lossless) in the sense that after the embedded information is extracted from the stego-image, we can revert to an exact copy of the original image. The new method is a fragile high-capacity data embedding technique that embeds message bits in groups of pixels according to their status. The status is obtained using a flipping operation (a permutation of grayscales) and a discrimination (prediction) function. The flipping simulates an "invertible noise adding", while the discrimination function measures how the flipping influences the local smoothness of the flipped group. The original status of the image groups is losslessly compressed and embedded in the image together with the message. At the receiving end, the message is read along with the original compressed status of the image. Knowledge of the original status is then used to completely remove the distortion due to data embedding.
The method provides high capacity at very small and invertible distortion. It can be modified for data embedding in compressed image formats, such as JPEG. Other applications of this scheme include secure invertible image authentication, distortion-free robust watermarking, and a new steganalytic technique for images.
Acknowledgements
The work on this paper was supported by the Air Force Research Laboratory, Air Force Materiel Command, USAF, under grant number F30602-00-1-0521. The U.S. Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation thereon. The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies, either expressed or implied, of the Air Force Research Laboratory or the U.S. Government.
References
1. Fridrich, J., Goljan, M., Du, R.: Invertible Authentication. In: Proc. SPIE, Security and Watermarking of Multimedia Contents, San Jose, California, January (2001)
2. Fridrich, J., Du, R., Long, M.: Steganalysis of LSB Encoding in Color Images. In: Proc. ICME 2000, New York City, New York, July (2000)
3. Fridrich, J., Goljan, M., Du, R.: Invertible Authentication Watermark for JPEG and MPEG Files. Submitted to the Special Session on Multimedia Security and Watermarking Applications, ITCC, Las Vegas, Nevada, April (2001)
4. Fridrich, J., Goljan, M., Du, R.: Distortion-Free Data Embedding (Further Analysis). In preparation.
5. Honsinger, C. W., Jones, P., Rabbani, M., Stoffel, J. C.: Lossless Recovery of an Original Image Containing Embedded Data. US Patent application, Docket No. 77102/E–D (1999)
6. Honsinger, C. W.: A Robust Data Hiding Technique Based on Convolution with a Randomized Phase Carrier. In: Proc. of PICS'00, Portland, Oregon, March (2000)
7. Macq, B.: Lossless Multiresolution Transform for Image Authenticating Watermarking. In: Proc. of EUSIPCO, Tampere, Finland, September (2000)
8. Sayood, K.: Introduction to Data Compression. Morgan Kaufmann Publishers, San Francisco, California (1996) 87–94
9. Westfeld, A., Pfitzmann, A.: Attacks on Steganographic Systems. In: Proc. 3rd Information Hiding Workshop, Dresden, Germany, September (1999) 61–75